<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[AI Impacts blog]]></title><description><![CDATA[A blog from AI Impacts about the future of artificial intelligence]]></description><link>https://blog.aiimpacts.org</link><image><url>https://substackcdn.com/image/fetch/$s_!Dyhz!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa479c77-1619-48cc-9eca-7a1cfa8a5f38_250x250.png</url><title>AI Impacts blog</title><link>https://blog.aiimpacts.org</link></image><generator>Substack</generator><lastBuildDate>Thu, 16 Apr 2026 07:50:07 GMT</lastBuildDate><atom:link href="https://blog.aiimpacts.org/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[AI Impacts]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[aiimpacts@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[aiimpacts@substack.com]]></itunes:email><itunes:name><![CDATA[Rick Korzekwa]]></itunes:name></itunes:owner><itunes:author><![CDATA[Rick Korzekwa]]></itunes:author><googleplay:owner><![CDATA[aiimpacts@substack.com]]></googleplay:owner><googleplay:email><![CDATA[aiimpacts@substack.com]]></googleplay:email><googleplay:author><![CDATA[Rick Korzekwa]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[FAQ: Expert Survey on Progress in AI methodology]]></title><description><![CDATA[Context]]></description><link>https://blog.aiimpacts.org/p/faq-expert-survey-on-progress-in</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/faq-expert-survey-on-progress-in</guid><dc:creator><![CDATA[Katja Grace]]></dc:creator><pubDate>Fri, 31 Oct 2025 16:48:51 
GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/e823c20a-44cd-4994-a759-fc1f6989717f_1880x1476.png" length="0" type="image/png"/><content:encoded><![CDATA[<h2><strong>Context</strong></h2><p>The Expert Survey on Progress in AI (ESPAI) is a big survey of AI researchers that I&#8217;ve led four times&#8212;in<a href="https://arxiv.org/abs/1705.08807"> 2016</a>, then annually:<a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2022_expert_survey_on_progress_in_ai#summary_of_results"> 2022</a>,<a href="https://arxiv.org/html/2401.02843v1"> 2023</a>, and 2024 (results coming soon!).</p><p>Each time so far it&#8217;s had substantial attention&#8212;the first one was the<a href="https://web.archive.org/web/20200125021243/https://www.altmetric.com/top100/2017/#list&amp;article=20475804"> 16th</a> &#8216;most discussed&#8217; paper in the world in 2017.</p><p>Various misunderstandings about it have proliferated, leading the methodology&#8217;s robustness and credibility to be underestimated (perhaps in part due to insufficient description of the methodology&#8212;the<a href="https://blog.aiimpacts.org/p/what-do-ml-researchers-think-about-ai-in-2022"> 2022 survey blog post</a> was terse). To avoid these misconceptions muddying interpretation of the 2024 survey results, I&#8217;ll answer key questions about the survey methodology here.</p><p>This covers the main concerns I know about. If you think there&#8217;s an important one I&#8217;ve missed, please tell me&#8212;in comments or by email (katja@aiimpacts.org).</p><p>Throughout, this post discusses the 2023 survey, but the other surveys are very similar. The biggest differences are that a few questions have been added over time, and we expanded from inviting respondents at two publication venues to six in 2023. The process for contacting respondents (e.g. 
finding their email addresses) has also seen many minor variations.</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!ygti!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d0d574b-4608-405d-ba5a-53b4fe3f6f93_3200x1800.png" width="1456" height="819" alt=""><figcaption class="image-caption">Summary of some (but not all) important questions addressed in this post.</figcaption></figure></div><h2><strong>How good is this survey methodology overall?</strong></h2><p>To my knowledge, the methodology is substantially stronger than is typical for surveys of AI researcher opinion. 
For comparison, <a href="https://zenodo.org/records/15118399">O&#8217;Donovan et al.</a> was reported on by <a href="https://www.nature.com/articles/d41586-025-01123-x">Nature Briefing</a> this year, and while 53% larger, its methodology appeared worse in most relevant ways: its response rate was 4% next to 2023 ESPAI&#8217;s 15%; it doesn&#8217;t appear to report efforts to reduce or measure non-response bias; the survey population was selected by the authors and not transparent; and 20% of completed surveys were apparently excluded (<a href="https://docs.google.com/document/d/1xib8FAkSN0sD_qtjwksBwP7e_BgGydnBNvivBV2Livo/edit?tab=t.0">see here</a> for a fuller comparison of the respective methodologies).</p><p>Some particular strengths of the ESPAI methodology:</p><ul><li><p><strong>Size: </strong>The survey is big&#8212;2778 respondents in 2023, which was the largest survey of AI researchers that had been conducted at the time.</p></li><li><p><strong>Bias mitigations: </strong>We worked hard to minimize the kinds of things that create non-response bias in other surveys&#8212;</p><ul><li><p>We wrote individually to every contactable author for a set of top publication venues rather than using less transparent judgments of expertise or organic spread.</p></li><li><p>We obscured the topic in the invitation.</p></li><li><p>We encouraged a high response rate across the board through payments, a pre-announcement, and many reminders. We have experimented over the years with details of these approaches (e.g. 
how much to pay respondents) in order to increase our response rate.</p></li></ul></li><li><p><strong>Question testing: </strong>We tested and honed questions by asking test-respondents to take the survey while talking aloud to us about what they think the questions mean and how they think about answering them.</p></li><li><p><strong>Testing for robustness over framings: </strong>For several topics, we ask similar questions in several different ways, to check if respondents&#8217; answers are sensitive to apparently unimportant framing choices. In earlier iterations of the survey, we found that responses are sensitive to framing, and so have continued including all framings in subsequent surveys. We also highlight this sensitivity in our reporting of results.</p></li><li><p><strong>Consistency over time: </strong>We have used almost entirely identical questions every year, so we can accurately report changes over time (the exceptions are a few minor edits to task descriptions to keep them accurate, and the addition of new questions).</p></li><li><p><strong>Non-respondent comparison: </strong>In 2023, we looked up available demographic details for a large number of non-respondents so that we could measure the representativeness of the sample along these axes.</p></li></ul><p>Criticisms of the ESPAI methodology seem to mostly be the result of basic misunderstandings, as I&#8217;ll discuss below.</p><h2><strong>Did lots of respondents skip some questions?</strong></h2><p>No.</p><p>Respondents answered nearly all the normal questions they saw (excluding demographics, free response, and conditionally-asked questions). 
Each of these questions was answered by on average 96% of those who saw it, with the most-skipped still at 90% (a question about the number of years until the occupation of &#8220;AI researcher&#8221; would be automatable).<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p><p>The reason it could look like respondents skipped a lot of questions is that in order to ask more distinct questions, we intentionally only directed a fraction (5-50%) of respondents to most questions. We selected those respondents randomly, so the smaller pool answering each question is an unbiased sample of the larger population.</p><p>Here&#8217;s the map of paths through the survey and how many people were given which questions in 2023. Respondents start at the introduction then randomly receive one of the questions or sub-questions at each stage. The randomization shown below is uncorrelated, except that respondents get either the &#8220;fixed-years&#8221; or &#8220;fixed-probabilities&#8221; framing throughout for questions that use those, to reduce confusion.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a></p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!uoi3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29aa4f8-aada-45dd-917d-cad66088c614_1152x1416.jpeg" width="1152" height="1416" alt="" loading="lazy"><figcaption class="image-caption">Map of randomization of question blocks in the 2023 survey. Respondents are allocated randomly to one place in each horizontal set of blocks.</figcaption></figure></div><h2><strong>Did only a handful of people answer extinction-related questions?</strong></h2><p>No. There were two types of question that mentioned human extinction risk, and roughly everyone who answered any question&#8212;97% and 95% of respondents respectively, thousands of people&#8212;answered some version of each.</p><p>Confusion on this point likely arose because there are three different versions of one question type&#8212;so at a glance you may notice that only about a quarter of respondents answered a question about existential risk to humanity within one hundred years, without seeing that the other three quarters of respondents answered one of two other very similar questions<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a>. (As discussed in the previous section, respondents were allocated randomly to question variants.) All three questions got a median of 5% or 10% in 2023.</p><p>This system of randomly assigning question variants lets us check the robustness of views across variations on questions (such as &#8216;...within a hundred years&#8217;), while still being able to infer that the median view across the population puts the general risk at 5% or more (if the chance is 5% within 100 years, it is presumably at least 5% for all time)<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a>.</p><p>In addition, every respondent was assigned another similar question about the chance of &#8216;extremely bad outcomes (e.g. 
human extinction)&#8217;, providing another check on their views.</p><p>So the real situation is that every respondent who completed the survey was asked about outcomes similar to extinction in two different ways, across four different precise questions. These four question variations all got similar answers&#8212;in 2023, a median 5% chance of something like human extinction (one question got 10%). So we can say that across thousands of researchers, the median view puts at least a 5% chance on extinction or similar from advanced AI, and this finding is robust across question variants.</p><h2><strong>Did lots of people drop out, biasing answers?</strong></h2><p>No. Of people who answered at least one question in the 2023 survey, 95% reached the final demographics question at the end<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a>. And, as discussed above, respondents answered nearly all the (normal) questions they saw.</p><p>So there is barely any room for bias from people dropping out. Consider the extinction questions: even if, most pessimistically, exactly the least concerned 5% of people didn&#8217;t get to the end, and the least concerned 5% of those who got there skipped the second extinction-related question (everyone answered the first one), then the true medians would be what now look like the 47.5th and 45th percentiles for the two question sets, which are still both 5%.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a></p><p>On a side note: one version of this concern envisages respondents dropping out in disapproval of the survey&#8217;s focus on topics like existential risk from AI, systematically biasing the remaining answers toward greater concern for that. This concern suggests a misunderstanding about the content of the survey. 
For instance, half of respondents were asked about everyday risks from AI such as misinformation, inequality, empowering authoritarian rulers or dangerous groups before extinction risk from AI was even mentioned as an example (Q2 vs. Q3). The other question about existential risk comes at the end.</p><h2><strong>Did the survey have a low response rate?</strong></h2><p>No. The 2023 survey was taken by 15% of those we contacted.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> This appears to be broadly typical or high for a survey such as this.</p><p>It seems hard to find clear evidence about typical response rates, because surveys can differ in so many ways. The best general answer we got was an analysis by<a href="https://www.yumpu.com/en/document/view/11869710/online-survey-response-rates-and-times-supersurvey"> Hamilton (2003)</a>, which found the median response rate across 199 surveys to be 26%. They also found that larger invitation lists tended to go with lower response rates&#8212;surveys sent to over 20,000 people, like ours, were expected to have a response rate in the range of 10%. 
And specialized populations (such as scientists) also commonly had lower response rates.</p><p>For another comparison, <a href="https://zenodo.org/records/15118399">O&#8217;Donovan et al. 2025</a> was a recent, similarly sized survey of AI researchers that used similar recruiting methods and got a 4% response rate.</p><h2><strong>Are the AI risk answers inflated much from concerned people taking the survey more?</strong></h2><p>Probably not.</p><p>Some background on the potential issue here: the ESPAI generally reports substantial probabilities of existential risk to humanity from advanced AI (&#8216;AI x-risk&#8217;)&#8212;the median probability of human extinction or similar has always been at least 5%<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a> (across different related questions and years). The question here is whether these findings represent the views of the broad researcher population, or whether they are caused by massive bias in who responds.</p><p>There are a lot of details in understanding why massive bias is unlikely, but in brief:</p><ol><li><p>As discussed above, very few people drop out after answering any questions, and very few people skip each question, so there cannot be much <strong>item non-response</strong> bias from respondents who find AI risk implausible dropping out or skipping questions.</p></li><li><p>This leaves the possibility of <strong>unit non-response bias</strong>&#8212;skewed results from people with certain opinions systematically participating less often. We limit the potential for this by writing individually to each person, with an invitation that doesn&#8217;t mention risk. Recipients might still infer more about the survey content through recognizing our names, recognizing the survey from previous years, following the links from the invitation, or looking at 1-3 questions without answering. 
However, for each of these there is reason to doubt they produce a substantial effect. The AI safety community may be overrepresented, and some people avoid participating because of other views; however, these groups appear to be a small fraction of the respondent pool. We also know different demographic subsets participate at different rates, and have somewhat different opinions on average. Nonetheless, it is highly implausible that the headline result&#8212;5% median AI x-risk&#8212;is caused by disproportionate participation from researchers who are strongly motivated by AI x-risk concerns, because the result is robust to excluding everyone who reports thinking about the social impacts of AI even moderately.</p></li></ol><h4><strong>The large majority of respondents answer every question&#8212;including those about AI x-risks</strong></h4><p>One form of non-response bias is item non-response, where respondents skip some questions or drop out of the survey. In this case, the concern would be that unconcerned respondents skip questions about risk, or drop out of the survey when they encounter such questions. But this can only be a tiny effect here&#8212;in 2023 ~95% of people who answered at least one question reached the end of the survey. (See section &#8220;Did lots of people drop out, biasing answers?&#8221;) If respondents were leaving due to questions about (x-)risk, we would expect fewer respondents to have completed the survey.</p><p>This also suggests low <em>unit</em> non-response bias among unconcerned members of the sample: if people often decline to participate when they recognize that the survey includes questions about AI x-risk, we&#8217;d also expect more respondents to drop out when they encounter such questions (especially since most respondents should not know the topic before they enter the survey&#8212;see below). 
Since very few people drop out upon seeing the questions, it would be surprising if a lot of people had dropped out earlier due to anticipating the question content.</p><h4><strong>Most respondents don&#8217;t know there will be questions about x-risk, because the survey invitation is vague</strong></h4><p>We try to minimize the opportunity for unit non-response bias by writing directly to every researcher we can who has published in six top AI venues rather than having people share the survey, and making the invitation vague: avoiding directly mentioning anything like risks at all, let alone extinction risk. For instance, the 2023 survey invitation describes the topic as &#8220;the future of AI progress&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a>.</p><p>So we expect most sample members are not aware that the survey includes questions about AI risks until after they open it.</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!yXXb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbdf819a8-2227-4639-afd7-7c90e238e5f0_1456x983.jpeg" width="1456" height="983" alt="" loading="lazy"><figcaption class="image-caption"><strong>2023 Survey invitation (though sent after<a href="https://docs.google.com/document/d/16JwJVBlv9PBgz1HGRuZg6Sxr-e1eR8Zy7TjD4yNAcv0/edit?tab=t.0#heading=h.qo5pn56uh8zm"> this</a> pre-announcement, which does mention my name and additional affiliations)</strong></figcaption></figure></div><p>There is still an opportunity for non-response bias from some people deciding not to answer the survey after opening it and looking at questions. However only around 15% of people who look at questions leave without answering any, and these people can only see the first three pages of questions before the survey requires an answer to proceed. Only the third page mentions human extinction, likely after many such people have left. So the scale of plausible non-response bias here is small.</p><h4><strong>Recognition of us or our affiliations is unlikely to have a large influence</strong></h4><p>Even in a vague invitation, some respondents could still be responding to our listed affiliations connecting us with the AI Safety community, and some recognize us.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a> This could be a source of bias. However different logos and affiliations get similar response rates<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a>, and it seems unlikely that very many people in a global survey have been recognizing us, especially since 2016 (when the survey had a somewhat higher response rate and the same median probability of extremely bad outcomes as in 2023)<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-12" href="#footnote-12" target="_self">12</a>. 
Presumably some people remember taking the survey in a previous year, but in 2023 we expanded the pool from two venues to six, reducing the fraction who might have seen it before, and got similar answers on existential risk (see<a href="https://arxiv.org/pdf/2401.02843v1"> p14</a>).</p><p>As confirmation that recognition of us or our affiliations is not driving the high existential risk numbers, recognition would presumably be stronger in some demographic groups than others, e.g. people who did undergrad in the US over Europe or Asia, and probably industry over academia. Yet when we checked in 2023, all these groups gave median existential risk numbers of at least 5%<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-13" href="#footnote-13" target="_self">13</a>.</p><h4><strong>Links to past surveys included in the invitation do not foreground risk</strong></h4><p>Another possible route to recipients figuring out there will be questions about extinction risk is that we do link to past surveys in the invitation. However, the linked documents (from 2022 or 2023) also do not foreground AI extinction risk, so this seems like a stretch.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-14" href="#footnote-14" target="_self">14</a></p><p>So it should be hard for most respondents to decide whether to respond based on the inclusion of existential risk questions.</p><h4><strong>AI Safety researchers are probably more likely to participate, but there are few of them</strong></h4><p>A big concern seems to be that members of &#8220;the AI (Existential) Safety community&#8221;, i.e. those whose professional focus is reducing existential risk from AI, are more likely to participate in the survey. 
This is probably true&#8212;anecdotally, people in this community are often aware of the survey and enthusiastic about it, and a handful of people wrote to check that their safety-interested colleagues had received an invitation.</p><p>However, this is unlikely to have a strong effect, since the academic AI Safety community is quite small compared to the number of respondents.</p><p>One way to roughly upper-bound the fraction of respondents from the AI Safety community is to note that they are very likely to have &#8216;a particular interest&#8217; in the &#8216;social impacts of smarter-than-human machines&#8217;. However, when asked &#8220;How much thought have you given in the past to social impacts of smarter-than-human machines?&#8221; only 10.3% gave an answer that high.</p><h4><strong>People decline the survey for opinion reasons, but probably few</strong></h4><p>As well as bias from concerned researchers being motivated to respond to the survey, at the other end of the spectrum there can be bias from researchers who are particularly motivated to avoid participating for reasons correlated with opinion. I know of a few instances of this, and a tiny informal poll suggested it could account for something like 10% of non-respondents<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-15" href="#footnote-15" target="_self">15</a>, though this seems unlikely, and even if so, it would have a small effect on the results.</p><h4><strong>Demographic groups differ in propensity to answer and average opinions</strong></h4><p>We have been discussing bias from people&#8217;s opinions affecting whether they want to participate. There could also be non-response bias from other factors influencing both opinion and desire to participate. For instance, in 2023 we found that women participated at around 66% of the base rate, and generally expected less extreme positive or negative outcomes. 
This is a source of bias; however, since women were only around one in ten of the total population, the scale of potential error from this is limited.</p><p>We similarly measured variation in the responsiveness of some other demographic groups, and also differences of opinion between these demographic groups among those who did respond, which together give some heuristic evidence of small amounts of bias. Aside from gender, the main dimension where we noted a substantial difference in both response rate and opinion was for people who did undergraduate study in Asia. They were only 84% as likely as the base rate to respond, and in aggregate expected high-level machine intelligence earlier and had higher median extinction or disempowerment numbers. This suggests an unbiased survey would find AI to be arriving sooner and more risky. So while this is a source of bias, it is in the opposite direction to the one that has prompted concern.</p><h4><strong>Respondents who don&#8217;t think about AI x-risk report the same median risk</strong></h4><p>We have seen various pieces of evidence that people engaged with AI safety do not make up a large fraction of the survey respondents. However, there is another strong reason to think extra participation from people motivated by AI safety does not drive the headline 5% median, regardless of whether they are overrepresented. We can look at answers from a subset of people who are unlikely to be substantially drawn by AI x-risk concern: those who report not having thought much about the issue. (If someone has barely ever thought about a topic, it is unlikely to be important enough to them to be a major factor in their decision to spend a quarter of an hour participating in a survey.) 
Furthermore, this probably excludes most people who would know about the survey or authors already, and so potentially anticipate the topics.</p><p>We asked respondents, &#8220;How much thought have you given in the past to social impacts of smarter-than-human machines?&#8221; and gave them these options:</p><ul><li><p>Very little. e.g. &#8220;I can&#8217;t remember thinking about this.&#8221;</p></li><li><p>A little. e.g. &#8220;It has come up in conversation a few times&#8221;</p></li><li><p>A moderate amount. e.g. &#8220;I read something about it now and again&#8221;</p></li><li><p>A lot. e.g. &#8220;I have thought enough to have my own views on the topic&#8221;</p></li><li><p>A great deal. e.g. &#8220;This has been a particular interest of mine&#8221;</p></li></ul><p>Looking at only respondents who answered &#8216;a little&#8217; or &#8216;very little&#8217;&#8212;i.e. those who had at most discussed the topic a few times&#8212;the median probability of &#8220;human extinction or similarly permanent and severe disempowerment of the human species&#8221; from advanced AI (asked with or without further conditions) was 5%, the same as for the entire group. Thus we know that people who are highly concerned about risk from AI are not responsible for the median x-risk probability being at least 5%. Without them, the answer would be the same.</p><h2><strong>Is the survey small?</strong></h2><p>No, it is large.</p><p>In 2023 we wrote to around 20,000 researchers&#8212;everyone whose contact details we could find from six top AI publication venues (NeurIPS, ICML, ICLR, AAAI, IJCAI, and JMLR).<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-16" href="#footnote-16" target="_self">16</a> We heard back from 2778. As far as we could tell, it was the largest ever survey of AI researchers at the time. 
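</p><p>For a rough sense of what a sample of this size buys statistically, here is a small illustration with synthetic numbers (not the survey data): with n = 2,778, the sampling uncertainty on a median estimate is small. The sketch below bootstraps the median of a made-up, right-skewed distribution of probability answers.</p>

```python
import random
import statistics

random.seed(0)

# Hypothetical stand-in for ~2,778 probability answers (NOT the real survey
# data): a right-skewed distribution with a median near 5%.
responses = [min(100.0, random.lognormvariate(1.6, 1.2)) for _ in range(2778)]

# Bootstrap the median: resample with replacement and recompute it many times.
boot_medians = sorted(
    statistics.median(random.choices(responses, k=len(responses)))
    for _ in range(1000)
)
lo, hi = boot_medians[25], boot_medians[974]  # ~95% interval

print(f"median ~ {statistics.median(responses):.1f}%")
print(f"bootstrap 95% CI ~ [{lo:.1f}%, {hi:.1f}%]")
```

<p>At this sample size the interval is a fraction of a percentage point wide, so sampling error alone cannot plausibly explain a 5% median; that is why the discussion above focuses on non-response bias instead.</p><p>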
(It could be this complaint was only made about the 2022 survey, which had 738 respondents before we expanded the pool of invited authors from two publication venues&#8212;NeurIPS and ICML&#8212;to six, but I&#8217;d say that was also pretty large<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-17" href="#footnote-17" target="_self">17</a>.)</p><h2><strong>Is the survey biased to please funders?</strong></h2><p>Unlikely: at a minimum, there is little incentive to please funders.</p><p>The story here would be that we, the people running the survey, might want results that support the views of our funders, in exchange for their funding. Then we might adjust the survey in subtle ways to get those answers.</p><p>I agree that where one gets funding is a reasonable concern in general, but I&#8217;d be surprised if it were relevant here. Some facts:</p><ul><li><p>It has been easy to find funding for this kind of work, so there isn&#8217;t much incentive to do anything above and beyond to please funders, let alone extremely dishonorable things.</p></li><li><p>A lot of the funding specifically raised for the survey is for paying the respondents. We pay them to get a high response rate and reduce non-response bias. It would be weird to pay $100k to reduce non-response bias, only to get yourself socially obligated to bring about non-response bias (on top of the already miserable downside of having to send $50 to 2000 people across lots of countries and banking situations). 
It&#8217;s true that the first effect is more obvious, so this might suffice if we just wanted to look unbiased, but in terms of our actual decision-making as I have observed it, it seems like we are weighing up a legible reduction in bias against effort and not thinking about funders much.</p></li></ul><h2><strong>Is it wrong to make guesses about highly uncertain future events without well-supported quantitative models?</strong></h2><p>One criticism is that even AI experts have no valid technical basis for making predictions about the future of AI. This is not a criticism of the survey methodology, per se, but rather a concern that the results will be misinterpreted or taken too seriously.</p><p>I think there are two reasons it is important to hear AI researchers&#8217; guesses about the future, even where they are probably not a reliable forecast.</p><p>First, it has often been assumed or stated that nobody who works in AI is worried about AI existential risk. If this were true, it would be a strong reason for the public to be reassured. However hearing the real uncertainty from AI researchers disproves this viewpoint, and makes a case for serious investigation of the concern. In this way even uncertain guesses are informative, because they let us know that the default assumption in confident safety was mistaken.</p><p>Second, there is not an alternative to making guesses about the future. Policy decisions are big bets on guesses about the future, implicitly. For instance when we decide whether to rush a technology or to carefully regulate it, we are guessing about the scale of various benefits and the harms.</p><p>Where trustworthy quantitative models are available, of course those are better. 
But in their absence, the guesses of a large number of relatively well-informed people are often better than the unacknowledged guesses of whoever is called upon to make implicit bets on the future.</p><p>That said, there seems little reason to think these forecasts are highly reliable&#8212;they should be treated as rough estimates, often better responded to with urgent, more dedicated analysis of the issues they hazily outline than by acting on the exact numbers.</p><h2><strong>When people say &#8216;5%&#8217; do they mean a much smaller chance?</strong></h2><p>The concern here is that respondents are not practiced at thinking in terms of probabilities, and may consequently say small numbers (e.g. 5%) when they mean something that would be better represented by an extremely tiny number (perhaps 0.01% or 0.000001%). Maybe especially if the request for a probability prompts them to think of integers between 0 and 100.</p><p>One reason to suspect this kind of error is that <a href="https://static1.squarespace.com/static/635693acf15a3e2a14a56a4a/t/64f0a7838ccbf43b6b5ee40c/1693493128111/XPT.pdf">Karger et al. (2023, p29)</a> found a group of respondents gave extinction probabilities nearly six orders of magnitude lower when prompted differently.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-18" href="#footnote-18" target="_self">18</a></p><p>This seems worth attending to, but I think it is unlikely to be a big issue here, for the following reasons:</p><ul><li><p>If your real guess is 0.0001%, and you feel that you should enter an integer number, the natural inputs would seem to be 0% or 1%&#8212;it&#8217;s hard to see how you would get to 5%.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-19" href="#footnote-19" target="_self">19</a></p></li><li><p>It&#8217;s hard to square 5% meaning something less than 1% with the rest of the distribution for the extinction questions&#8212;does 10% mean less than 2%? 
What about 50%? Where the median response is 5%, many respondents gave these higher numbers, and it would be strange if these were also confused efforts to enter minuscule numbers, and also strange if the true distribution had a lot of entries greater than 10% but few around 5%. (See<a href="https://arxiv.org/pdf/2401.02843v1"> Fig. 10</a> in the paper for the distribution for one extinction-relevant question.)</p></li><li><p>Regarding the effect in Karger et al. (2023) specifically, this has only been observed once to my knowledge, and is extreme and surprising. And even if it turned out to be real and widespread, including in highly quantitatively educated populations, it would be a further question whether the answers produced by such prompting are more reliable than standard ones. So it seems premature to treat this as substantially undermining respondents&#8217; representations of their beliefs.</p></li><li><p>It would surprise me if that effect were widely replicable and people stayed with the lower numbers, because in that world I&#8217;d expect people outside of surveys to often radically reduce their probabilities of AI x-risk when they give the question more thought (e.g. enough to have run into other examples of low probability events in the meantime). Yet survey respondents who have previously given a &#8220;lot&#8221; or a &#8220;great deal&#8221; of thought to the social impacts of smarter-than-human machines give similar and somewhat higher numbers than those who have thought less. (See<a href="https://arxiv.org/pdf/2401.02843v1"> A.2.1</a>)</p></li></ul><h2><strong>What do you think are the most serious weaknesses and limitations of the survey?</strong></h2><p>While I think the quality of our methodology is exceptionally high, there are some significant limitations of our work. 
These don&#8217;t affect our results about expert concern about risk of extinction or similar outcomes, but do add some noteworthy nuance.<br><br><strong>1) Experts&#8217; predictions are inconsistent and unreliable</strong><br>As we&#8217;ve emphasized in our papers reporting the survey results, experts&#8217; predictions are often inconsistent across different question framings&#8212;such sensitivity is not uncommon, and we&#8217;ve taken care to mitigate this by using multiple framings. Experts also have such a wide variety of predictions on many of these questions that they must each be fairly inaccurate on average (though this says nothing about whether their aggregate judgments as a group are good).<br><br><strong>2) It is not entirely clear what sort of &#8220;extremely bad outcomes&#8221; experts imagine AI will cause</strong></p><p>We ask two different types of questions related to human extinction: 1) a question about &#8220;extremely bad outcomes (e.g. human extinction)&#8221;, and 2) questions about &#8220;human extinction or similarly permanent and severe disempowerment of the human species&#8221;. We made the latter broader than &#8216;human extinction&#8217; because we are interested in scenarios that are effectively the end of humanity, rather than just those where literally every <em>Homo sapiens</em> is dead. This means, however, that it isn&#8217;t clear how much probability participants place on literal extinction versus adjacent strong human disempowerment and other extremely bad scenarios. And there is some evidence that the fraction is low: some respondents explicitly mentioned risks other than extinction in write-in responses, and anecdotally, it seems common for AI researchers to express more concern about issues other than human extinction.</p><p>For many purposes, it isn&#8217;t important to distinguish between extinction and outcomes that are similarly extremely bad or disempowering to humanity. 
Yet if the catastrophes many participants have in mind are not human extinction, but the results lend themselves to simplification as &#8216;risk of extinction&#8217;, this can be misleading. And perhaps more than you&#8217;d expect, if for instance &#8216;extinction&#8217; tends to bring to mind a different set of causes than &#8216;permanent and severe human disempowerment&#8217;.</p><p><strong>3) Non-response bias is hard to eliminate</strong></p><p>Surveys generally suffer from some non-response bias. We took many steps to minimize this, and find it implausible that our results are substantially affected by whatever bias remains (see the earlier question &#8220;Are the AI risk answers inflated much from concerned people taking the survey more?&#8221;). But we could do even more to estimate or eliminate response bias, e.g. paying some respondents much more than $50 to complete the survey and estimating the effect of doing so.</p><h2><strong>Is this the kind of low quality research that couldn&#8217;t get into an academic journal?</strong></h2><p>No. 
We published the near-identical 2016 survey in the Journal of AI Research, so the methodology has essentially been peer reviewed.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-20" href="#footnote-20" target="_self">20</a> Publication is costly and slow, and AI survey results are much more interesting sooner rather than later.</p><p>The <a href="https://www.jair.org/index.php/jair/article/view/19087">2023 paper</a> has now also been published, but in the meantime you had <a href="https://arxiv.org/html/2401.02843v1">the results</a> more than a year earlier!</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>See <a href="https://arxiv.org/pdf/2401.02843v1">Appendix D in the paper</a>.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>A random subset of respondents also gets asked additional open response questions after questions shown, and which respondents receive each of these is correlated.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>The three variants of the extinction question (differences in bold): <br><br><em>What probability do you put on <strong>future AI advances</strong> causing human extinction or similarly permanent and severe disempowerment of the human species?</em></p><p><em>What probability do you put on <strong>human inability to control future advanced AI systems</strong> causing human extinction or similarly permanent and severe disempowerment of the human species?</em></p><p><em>What probability do you 
put on <strong>future AI advances</strong> causing human extinction or similarly permanent and severe disempowerment of the human species <strong>within the next 100 years</strong>?</em><br><br>See <a href="https://wiki.aiimpacts.org/_media/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2023_espai_paid.pdf">here</a> for all the survey questions.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>If some people say a risk is &#8805;5% ever, and some say it is &#8805;5% within a hundred years, and some say it is &#8805;5% from a more specific version of the problem, then you can infer that the whole group thinks the chance ever from all versions of the problem is at least &#8805;5%.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>See the figure above for the flow of respondents through the survey, or<a href="https://arxiv.org/pdf/2401.02843v1"> Appendix D in the paper</a> for more related details</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>I&#8217;m combining the three variants in the second question set for simplicity.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>15% entered any responses, 14% got to the last question.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" 
href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>Across three survey iterations and up to four questions: 2016 [5%], 2022 [5%, 5%, 10%], 2023 [5%, 5%, 10%, 5%]; see p14 of<a href="https://arxiv.org/pdf/2401.02843v1"> the 2023 paper</a>. Reading some of the write-in comments, we noticed a number of respondents mention outcomes in the &#8216;similarly bad or disempowering&#8217; category.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>See invitations<a href="https://docs.google.com/document/d/16JwJVBlv9PBgz1HGRuZg6Sxr-e1eR8Zy7TjD4yNAcv0/edit?tab=t.0#heading=h.gxl3g2uzwj9"> here</a>.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>Four mentioned my name, &#8216;Katja&#8217;, in the write-in responses in 2024, and two of those mentioned there that they were familiar with me. I usually recognize a (very) small fraction of the names, and friends mention taking it.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p>In 2022 I sent the survey under different available affiliations and logos (combinations of Oxford, the Future of Humanity Institute, the Machine Intelligence Research Institute, AI Impacts, and nothing), and these didn&#8217;t seem to make any systematic difference to response rates. The combinations of logos we tried all got similar response rates (8-9%, lower than the ~17% we get after sending multiple reminders). 
Regarding affiliations, some combinations got higher or lower response rates, but not in a way that made sense except as noise (Oxford + FHI was especially low, Oxford was especially high). This was not a careful scientific experiment: I was trying to increase the response rate, so also varying other elements of the invitation, and focusing more on variants that seemed promising so far (sending out tiny numbers of surveys sequentially then adjusting). That complicates saying anything precise, but if MIRI or AI Impacts logos notably encouraged participation, I think I would have noticed.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-12" href="#footnote-anchor-12" class="footnote-number" contenteditable="false" target="_self">12</a><div class="footnote-content"><p>I&#8217;m not sure how famous either is now, but respondents gave fairly consistent answers about the risk of very bad outcomes across the three surveys starting in 2016&#8212;when I think MIRI was substantially less famous, and AI Impacts extremely non-famous.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-13" href="#footnote-anchor-13" class="footnote-number" contenteditable="false" target="_self">13</a><div class="footnote-content"><p>See Appendix A.3 of<a href="https://arxiv.org/pdf/2401.02843"> our 2023 paper</a></p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-14" href="#footnote-anchor-14" class="footnote-number" contenteditable="false" target="_self">14</a><div class="footnote-content"><p>2023 links: the<a href="https://arxiv.org/abs/1705.08807"> 2016 abstract</a> doesn&#8217;t mention it, focusing entirely on timelines to AI performance milestones, and the<a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2022_expert_survey_on_progress_in_ai#expert_survey_on_progress_in_ai"> 2022 wiki page</a> is 
not (I think) a particularly compelling read and doesn&#8217;t get to it for a while. 2022 link: the<a href="https://scholar.google.com/citations?view_op=view_citation&amp;hl=en&amp;citation_for_view=PUQJdUsAAAAJ:9yKSN-GCB0IC"> 2016 survey Google Scholar page</a> doesn&#8217;t mention it.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-15" href="#footnote-anchor-15" class="footnote-number" contenteditable="false" target="_self">15</a><div class="footnote-content"><p>In 2024 we included a link for non-respondents to quickly tell us why they didn&#8217;t want to take the survey. It&#8217;s not straightforward to interpret this (e.g. &#8220;don&#8217;t have time&#8221; might still represent non-response bias, if the person would have had time if they were more concerned), and only a handful of people responded out of tens of thousands, but 2/12 cited wanting to prevent consequences they expect from such research among multiple motives (advocacy for slowing AI progress and &#8216;long-term&#8217; risks getting attention at the expense of &#8216;systemic problems&#8217;).</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-16" href="#footnote-anchor-16" class="footnote-number" contenteditable="false" target="_self">16</a><div class="footnote-content"><p>Most machine learning research is published in conferences. 
NeurIPS, ICML, and ICLR are widely regarded as the top-tier machine learning conferences; AAAI and IJCAI are often considered &#8220;tier 1.5&#8221; venues, and also include a wider range of AI topics; JMLR is considered the top machine learning journal.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-17" href="#footnote-anchor-17" class="footnote-number" contenteditable="false" target="_self">17</a><div class="footnote-content"><p>To my knowledge the largest at the time, but I&#8217;m less confident there.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-18" href="#footnote-anchor-18" class="footnote-number" contenteditable="false" target="_self">18</a><div class="footnote-content"><p>Those respondents were given some examples of (non-AI) low probability events, such as that there is a 1-in-300,000 chance of being killed by lightning, and then asked for probabilities in the form &#8216;1-in-X&#8217;</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-19" href="#footnote-anchor-19" class="footnote-number" contenteditable="false" target="_self">19</a><div class="footnote-content"><p>It wouldn&#8217;t surprise me if in fact a lot of the 0% and 1% entries would be better represented by tiny fractions of a percent, but this is irrelevant to the median and nearly irrelevant to the mean.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-20" href="#footnote-anchor-20" class="footnote-number" contenteditable="false" target="_self">20</a><div class="footnote-content"><p>Differences include the addition of several questions, minor changes to questions that time had rendered inaccurate, and variations in email wording.</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[Reanalyzing the 2023 Expert Survey on Progress in AI]]></title><description><![CDATA[With new charts, and 
a newly open-source codebase]]></description><link>https://blog.aiimpacts.org/p/reanalyzing-the-2023-expert-survey</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/reanalyzing-the-2023-expert-survey</guid><dc:creator><![CDATA[Ben Weinstein-Raun]]></dc:creator><pubDate>Mon, 16 Dec 2024 06:08:37 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!IbPX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!IbPX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!IbPX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png 424w, https://substackcdn.com/image/fetch/$s_!IbPX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png 848w, https://substackcdn.com/image/fetch/$s_!IbPX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png 1272w, https://substackcdn.com/image/fetch/$s_!IbPX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!IbPX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png" width="1456" height="1092" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:256661,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!IbPX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png 424w, https://substackcdn.com/image/fetch/$s_!IbPX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png 848w, https://substackcdn.com/image/fetch/$s_!IbPX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png 1272w, https://substackcdn.com/image/fetch/$s_!IbPX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6e22cad-d95b-4949-8e63-63c31780b215_2400x1800.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">An illustration of the central range of expert responses, when asked about the timeline of automating all human labor.</figcaption></figure></div><p>There&#8217;s <a href="https://aiimpacts.org/how-should-we-analyse-survey-forecasts-of-ai-timelines/">a new report</a> on the AI Impacts website that reanalyzes the data from the 2023 Expert Survey on Progress in AI (originally written up in <a href="https://arxiv.org/abs/2401.02843">Thousands of AI Authors on the Future of AI</a>).<br></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank"
href="https://substackcdn.com/image/fetch/$s_!W-vN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!W-vN!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png 424w, https://substackcdn.com/image/fetch/$s_!W-vN!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png 848w, https://substackcdn.com/image/fetch/$s_!W-vN!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png 1272w, https://substackcdn.com/image/fetch/$s_!W-vN!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!W-vN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png" width="1493" height="1096" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1096,&quot;width&quot;:1493,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:611754,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!W-vN!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png 424w, https://substackcdn.com/image/fetch/$s_!W-vN!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png 848w, https://substackcdn.com/image/fetch/$s_!W-vN!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png 1272w, https://substackcdn.com/image/fetch/$s_!W-vN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45c26693-b9eb-46e4-98ff-d5e323728e1a_1493x1096.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">A (mostly) analogous chart to the above, this one from the original paper</figcaption></figure></div><p>The report, by <a href="https://tadamcz.com/">Tom Adamczewski</a>, introduces several improvements over the earlier analysis. Even better, it comes with a brand new <a href="https://github.com/tadamcz/espai/">open-source codebase</a>, also by Tom, that anyone can use to perform their own analyses on the data.<br><br>I found the report illuminating, and I recommend it to anyone interested in the tradeoffs and options available in eliciting, analyzing, and presenting quantitative estimates from experts. 
Plus there are many excellent new graphs!</p>]]></content:encoded></item><item><title><![CDATA[Winners of the Essay competition on the Automation of Wisdom and Philosophy]]></title><description><![CDATA[We&#8217;re delighted to announce the winners of the Essay competition on the Automation of Wisdom and Philosophy.]]></description><link>https://blog.aiimpacts.org/p/winners-of-the-essay-competition</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/winners-of-the-essay-competition</guid><dc:creator><![CDATA[Owen Cotton-Barratt]]></dc:creator><pubDate>Mon, 28 Oct 2024 16:58:30 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!jNDm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>We&#8217;re delighted to announce the winners of the <em><a href="https://blog.aiimpacts.org/p/essay-competition-on-the-automation">Essay competition on the Automation of Wisdom and Philosophy</a></em>.&nbsp;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jNDm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!jNDm!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png 424w, https://substackcdn.com/image/fetch/$s_!jNDm!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png 848w, 
https://substackcdn.com/image/fetch/$s_!jNDm!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png 1272w, https://substackcdn.com/image/fetch/$s_!jNDm!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!jNDm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!jNDm!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png 424w, https://substackcdn.com/image/fetch/$s_!jNDm!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png 848w, 
https://substackcdn.com/image/fetch/$s_!jNDm!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png 1272w, https://substackcdn.com/image/fetch/$s_!jNDm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb057ea80-211e-4a32-8321-62fc85b4c9e9_1600x914.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><h1>Overview</h1><p>The competition attracted 90 entries in total (only one of which was obviously just the work of an LLM!),
taking a wide variety of angles on the topic. The judges awarded the top four prizes as follows:</p><ul><li><p>$7,000 to Rudolf Laine for essays on wisdom, amortised optimisation, and AI:</p><ul><li><p>Part I: <em><a href="https://rudolf.website/wisdom1/">Wisdom, amortised optimisation, and AI</a></em></p></li><li><p>Part II: <em><a href="https://rudolf.website/wisdom2/">Growth and amortised optimisation</a></em></p></li><li><p>Part III: <em><a href="https://rudolf.website/wisdom3/">AI effects on amortised optimisation</a></em></p></li></ul></li><li><p>$6,000 to Thane Ruthenis for <em><a href="https://aiimpacts.org/towards-the-operationalization-of-philosophy-wisdom/">Towards the operationalization of philosophy &amp; wisdom</a></em></p></li><li><p>$4,000 to Chris Leong for essays on training wise AI systems:</p><ul><li><p><em><a href="https://aiimpacts.org/an-overview-of-obvious-approaches-to-training-wise-ai-advisors/">An Overview of &#8220;Obvious&#8221; Approaches to Training Wise AI Advisors</a></em></p></li><li><p><em><a href="https://aiimpacts.org/some-preliminary-notes-on-the-promise-of-a-wisdom-explosion/">Some Preliminary Notes on the Promise of a Wisdom Explosion</a></em></p></li></ul></li><li><p>$3,000 to Gabriel Recchia for <em><a href="https://thediscontinuity.substack.com/p/should-we-just-be-building-more-datasets">Should we just be building more datasets?</a></em></p></li></ul><p>Additionally, the judges awarded ten runner-up prizes, of $500 each. 
These adjustments to the prize schedule were made to better reflect the judges&#8217; assessments &#8212; see footnote for details<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a>.&nbsp;</p><p>Many of those essays which did not get prizes still had something valuable to recommend them, and we are very grateful for the deep thought participants put into engaging with these potentially-crucial topics.</p><p>In the rest of this post, we&#8217;ll link to all fourteen of the prize-winning entries, and provide judges&#8217; commentary on these. We have made a point to include critical commentary as well as praise. In our view this is an essential part of the competition &#8212; helping to draw collective attention to the ideas people have presented, and (we hope) helping to advance the discourse on what makes for valuable work.&nbsp;</p><h2>Judge introductions</h2><p><strong>Andreas Stuhlm&#252;ller (AS)</strong> &#8212; Hi, I'm CEO &amp; cofounder of Elicit, an AI company working on scaling up high-quality reasoning, starting with science. I've been interested in how AI can differentially advance wisdom for a long time, and (pre LLMs) founded the non-profit Ought to work on that topic.</p><p><strong>Linh Chi Nguyen (CN) &#8212; </strong>Hi, I have been thinking about long-term risk from AI besides alignment for a few years. I landed on thinking a bunch about AIs&#8217; decision theories, which is what I&#8217;m currently doing. In that process, I&#8217;ve thought a bit more generally about the philosophical capabilities and inclinations of AI systems and I&#8217;m excited about more work in this area.</p><p><strong>Bradford Saad (BS)</strong> &#8212; Hi, I&#8217;m a philosopher and senior research fellow at Oxford&#8217;s Global Priorities Institute. Most of my past work is in philosophy of mind. 
But for the past few years I have been working on AI and catastrophic risks, including the risk that future AI agents will cause catastrophes as a result of making philosophical errors. (I&#8217;ll add that my comments are not as developed and well-put as I&#8217;d like and that with apologies I&#8217;ve only managed to write comments for some winning entries in time for the announcements.)</p><p><strong>David Manley (DM)</strong> &#8212; Hi, I&#8217;m an Associate Professor of Philosophy at the University of Michigan, Ann Arbor. I&#8217;ve worked in philosophical semantics, ontology, epistemology, and global priorities research.&nbsp;</p><p>Due to timing issues, Wei Dai withdrew from the formal judging.</p><h1>Top Prize Winners</h1><p>Note that after the winners had been chosen, we provided them with feedback from the judges, and gave them an opportunity to revise their submissions before this announcement. The judges&#8217; comments are generally based on the original submitted versions.</p><h2>Wisdom, amortised optimisation, and AI &#8212; Rudolf Laine &#8212; $7,000</h2><p>This was submitted as one essay, and split into three parts during the revision process, to improve readability:</p><ul><li><p>Part I: <a href="https://rudolf.website/wisdom1/">Wisdom, amortised optimisation, and AI</a></p></li><li><p>Part II: <a href="https://rudolf.website/wisdom2/">Growth and amortised optimisation</a></p></li><li><p>Part III: <a href="https://rudolf.website/wisdom3/">AI effects on amortised optimisation</a></p></li></ul><h3>Summary</h3><p>A lot of wisdom is about mental qualities shaped more by amortised optimisation than direct optimisation. Direct optimisation is solving problems by searching through alternatives on the fly, while amortised optimisation is solving a problem by applying past data and computation.
LLMs are products of amortised optimisation, but can be used for direct optimisation.</p><p>Amortised optimisation matters more when growth and change are slow, so its relative importance will likely decline with the growth and change caused by AI. However, a model suggests it's hard for amortised optimisation to entirely decline in relevance.</p><p>Human amortised optimisation is enabled by things like cultural evolution and written knowledge. Advanced AIs may weaken human cultural evolution, but may themselves be very good at it due to their low copying costs. AIs might help humans distil insights from written knowledge, but may ultimately replace the need for it.</p><p>A benefit of wisdom is avoiding large-scale, or "strategic", errors. Good strategic planning often relies on amortised optimisation. If we need to get strategy right about novel things like AI, we could use AIs to help with simulation. A specific type of strategy error that seems hard to solve with AI is wrong paradigms.</p><h3>BS&#8217;s comments</h3><p>This entry was one of my nominations for a top prize. It persuaded me that the contrast between direct vs. amortised optimization does a lot of work in explaining the contrast between intelligence and wisdom, and that the former distinction merits more consideration from a macrostrategic perspective when considering how to make the evolution of AI go well. The essay also contains many points that I found plausible and worthy of further reflection.
I wrote some <a href="https://docs.google.com/document/d/1RBT1mv8L-eZeVdeVoNVhiwjn4SQ_6pLfoG_GGqnMgdI/edit?usp=sharing">notes</a> in an attempt to distill ideas that stood out to me in the submitted version, but I would recommend that people read the essay rather than the notes.</p><h3>CN&#8217;s comments</h3><p>Tl;dr: Interesting ideas analysed in a way that makes it easy to trust the author, but somewhat lacking in a coherent, convincing, and important narrative.</p><p><s>I think of this text as having three parts that don&#8217;t quite follow the way the author divided the text. The first one offers a characterisation of wisdom. The second theoretically discusses the future of wisdom, including a growth model and considering cultural evolution. The third one discusses concrete and practical ways in which LLMs might contribute to wisdom.</s></p><p><s>All parts were interesting although I thought they weren&#8217;t very well integrated such that, after reading, it was difficult to remember the contents of the text. I think with a little work, the text might read more like a coherent piece or, failing that, might benefit from being split. I thought the first two sections were clearly stronger than the last section.</s> <em>[Editor&#8217;s note: it has now been split along these lines.]</em></p><p>I would like to highlight that I perceived the author to take an appropriate epistemic stance in this text. At virtually all points in the text when I thought &#8220;what about this objection?&#8221; or &#8220;this doesn&#8217;t seem quite true because of X&#8221;, the text addressed my concern in one of the following paragraphs. Sometimes, the text didn&#8217;t really provide an answer and instead just noted the concern. 
While that&#8217;s obviously worse than providing an answer, I really appreciated that the author acknowledged potential weaknesses of his conclusions.</p><p>That said, the entry didn&#8217;t offer a big original idea with a &#8220;big if true and plausibly true&#8221; feel (which, to be clear, is a very high bar, and I do think that the text contributes many interesting thoughts). I also have some reservations about this type of highly conceptual, foundational work on philosophy and wisdom since the path to impact is relatively long.</p><p>Overall, I think the text merits a top prize for what I consider its virtues in scholarship and the ideas that it contains.</p><h3>DM&#8217;s comments</h3><p>This is a very thought-provoking set of posts, about which I have several concerns. (The TL;DR is that if you read these, you should also read &#8220;Tentatively against making AIs &#8216;wise&#8217;&#8221; as a counterbalance.) What follows is a summary of my <a href="https://docs.google.com/document/d/1LJy4mMPJUECpY8zupkurBErpYRHGEwQhpXrRxTDQbc4/edit?usp=sharing">notes</a> on the original submission in which these posts formed a single paper, and in which Laine was making a bolder claim about the connection between amortized optimization (AO) and wisdom.&nbsp;</p><p>We are told that <em>a lot of wisdom is amortized optimization</em>. We might wonder: how often do instances of wisdom involve AO, and when they do, how <em>much</em> of the wisdom is attributable to AO? Surely, being produced by AO isn&#8217;t <em>sufficient</em> for a process or heuristic to be wise: Laine admits as much.</p><p>I would go further and argue that what distinguishes wise judgments from unwise ones, irrespective of how much AO is involved, has more to do with aspects of reasoning that fall on the <em>other</em> side of Laine&#8217;s conceptual divide.
At least, so I would argue if we&#8217;re using &#8220;wise&#8221; in any <em>normative</em> sense, any sense in which it seems obvious that we want to make people and AIs wiser than they are. Indeed, I&#8217;m more confident that legibly good epistemics are crucial for this normative sense of wisdom than I am that there is any deep connection between wisdom and amortized optimization, per se.&nbsp;</p><p>Take Laine&#8217;s example of cultural evolution. Crucially, enacting whatever practices have been handed down to us by our forebears is only <em>sometimes</em> wise; the fact that it involves AO doesn&#8217;t distinguish between cases where it&#8217;s done wisely and those in which it isn&#8217;t. In the best kind of case, the human consciously knows that they are benefiting from many generations of (say) successful arrow-making, and that they are unlikely to do better by designing a new method from the ground up. In the worst kind of case, the human blindly and superstitiously mimics cultural practices they&#8217;ve inherited, leading to harmful cultural inertia.&nbsp;</p><p>In short, the <em>wisest</em> use of cultural evolution will be clear-eyed about <em>why</em> it makes sense to do so-- that is, it will verify the use of AO using legible and deliberate cognition. And often, consciously <em>rejecting</em> a more AO-style process in favor of a more transparent one will be the wisest path.</p><p>(Note also that Laine&#8217;s examples involve cases where a practice/artifact has been optimized for something beneficial, like survival. But often selection processes aren&#8217;t even optimized for anything beneficial to begin with. For example, many memes succeed not through benefiting the host, but because they happen to be good at spreading. I doubt we want to count the use of an AO artifact as wise as long as it <em>just so happens</em> that it&#8217;s beneficial. 
Adopting the output of an AO process without any reflection about the process that produced it or its current applicability is often very unwise, even if one gets lucky.)</p><p>These points matter not just for the verbal dispute about what constitutes &#8220;wisdom&#8221;, but for the practical question about what kinds of cognitive features we should imbue our AIs with. For example, Laine characterizes wisdom in part by illegibility; but do we want to create systems whose apparently deep insights people trust despite the lack of any transparent lines of reasoning behind them? As Delaney&#8217;s submission notes, insofar as we characterize &#8220;wisdom&#8221; by mystery/enigma/illegibility, we may want to actively avoid building systems that are wise in this sense.</p><h2><a href="https://aiimpacts.org/towards-the-operationalization-of-philosophy-wisdom/">Towards the operationalization of philosophy &amp; wisdom</a> &#8212; Thane Ruthenis &#8212; $6,000</h2><h3>Summary</h3><p>I provide candidate definitions for philosophy and wisdom, relate them to intuitive examples of philosophical and wise reasoning, and offer a tentative formalization of both disciplines. The motivation for this is my belief that their proper operationalization is the bottleneck both to scaling up the work done in these domains (i.e., creating an ecosystem), and to automatizing them.</p><p>I operationalize philosophy as &#8220;the process of deriving novel ontologies&#8221;, further concretized as &#8220;deriving some assumptions using which reality-as-a-whole could be decomposed into domains that could be studied separately&#8221;. I point out the similarity of this definition to John Wentworth&#8217;s operationalization of natural abstractions, from which I build the tentative formal model. In addition, I link philosophical reasoning to conceptual/qualitative/non-paradigmatic research, arguing that they&#8217;re implemented using the same cognitive algorithms.
Counterweighting that, I define philosophy-as-a-discipline as a special case of this reasoning, focused on decomposing the &#8220;dataset&#8221; represented by the sum of all of our experiences of the world.</p><p>I operationalize wisdom as meta-level cognitive heuristics that iterate on object-level heuristics for planning/inference, predicting the real-world consequences of a policy which employs said object-level heuristics. I provide a framework of agency in which that is well-specified as &#8220;inversions of inversions of environmental causality&#8221;.</p><p>I close things off with a discussion of whether AIs would be wise/philosophical (arguing yes), and what options my frameworks offer regarding scaling up or automatizing these kinds of reasoning.</p><h3>DM&#8217;s comments</h3><p>There is a great deal that I liked about this paper, though I think it would be better construed as offering &#8220;operationalizations&#8221; not of <em>philosophy</em> and <em>wisdom</em> but of <em>one characteristic thing philosophers (and often other theorists) do</em>, and <em>one characteristic thing wise people do</em>. The two conceptual tasks that are operationalized here--viz., coming up with &#8220;ontologies&#8221; in Ruthenis&#8217;s sense, and applying good meta-heuristics--are important enough to consider in their own right, even if they overlap only partially with the notions of philosophy and wisdom respectively.&nbsp;</p><p>Among the &#8220;other theorists&#8221; I have in mind are theoretical physicists, who are actively engaged with the question of which ontology best characterizes the domain of physics (e.g.
whether strings should be a fundamental part of it); and while there is <em>also</em> a thriving field of the philosophy of physics, <em>that</em> area seems best described as pursuing meta-questions about the relationship between these competing ontologies and the reality behind them, especially in the case where we may have reached an in-principle limit to our ability to empirically distinguish them. In other words, the borderline between physics and philosophy of physics in fact has to do with the presence or absence of in-principle empirical evidence, not with whether there are competing ontologies at issue. Likewise, I would say, for other domains where philosophy interacts with the sciences: for example, the question of how to carve up cognitive traits most usefully to predict and explain behavior or pathology is well within the ambit of psychology, whereas the philosophy of mind pursues questions even further from the empirical frontier.</p><p>In characterizing the probabilistic independence criterion for ontologies, I wish Ruthenis had explained how this relates to more traditional articulations of theoretical desiderata in the philosophy of science, especially explanatory and predictive power. I was also unsure about how the model is supposed to apply in a case where a given higher level (functional) state (such as being a tree, to use Ruthenis&#8217;s example) has multiple possible lower-level realizations that are all mutually inconsistent (specifying, among other things, exactly how many leaves are on a tree), since the latter will yield probabilities of zero for P(Li | L \ Li &#8743; H).&nbsp;</p><p>I agree that the notion of meta-heuristics is an important aspect of wisdom and that it&#8217;s importantly distinct from &#8220;outside-view&#8221; reasoning.
The discussion of wisdom overlaps somewhat with Laine&#8217;s submission, making the point that wisdom is not explicit knowledge but is stored as &#8220;learned instincts, patterns of behavior&#8230; cultural norms&#8221; that are crystallized through both biological and cultural evolution. I would argue that Ruthenis&#8217;s analysis cuts more finely here, helping to explain why not everything crystallized in this way fits into the &#8220;wisdom&#8221; category (why is a thing like GPT-1, which is made of amortized optimization, not actually wise?); and also, conversely, why not everything that is wise has been crystallized in this way. (&#8220;Wisdom can nevertheless be inferred &#8220;manually&#8221;... purely from the domain&#8217;s object-level model, given enough effort and computational resources.&#8221;)</p><h3>CN&#8217;s comments</h3><p>Tl;dr: Thought-provoking, many interesting ideas, but lacking in analysis and scholarship. <em>[Editor&#8217;s note: the final version was edited to address some of these issues, but I don&#8217;t know how much the judges would still be concerned about them.]</em></p><p>I thought the text was very ambitious in a good way and made a lot of points that made me think. For example, the definition of wisdom made me wonder whether it seems right and go through examples and potential counterexamples. So, from that perspective, I really liked the text and appreciate its ideas.</p><p>That said, the quality of writing and analysis could have been improved a lot. Many of the thoughts, while very interesting, felt merely stated instead of argued for. In particular, many claims were confidently stated as conclusions when I didn&#8217;t think they followed from previous arguments. (I also couldn&#8217;t help but notice that virtually all examples were drawn from the LessWrong memesphere, which isn&#8217;t a problem per se, but adds to the sense that the author isn&#8217;t necessarily aware of the gaps in argumentation.) 
I also would have liked the text to engage more with reasons to doubt its own conclusions (or at least not state them as confidently in places where I thought it was unwarranted).</p><p>I also would have liked it if the text had tried to situate itself more in existing literature, as I&#8217;m sure that many of the text&#8217;s ideas have already been discussed elsewhere. The text is also hard to follow at times, such that I would have benefited from more and clearer explanations. I also have some reservations about this type of highly conceptual, foundational work on philosophy and wisdom since the path to impact is relatively long.</p><p>As we are highlighting this text by giving it a top prize, I would like to share that I wouldn&#8217;t want people to read it without applying a strong critical lens. I think some texts are valuable because they give you a lot of food for thought. Other texts are valuable because they are rigorous, do a lot of the work of critical thinking for you, and make you trust the author&#8217;s conclusion. This text definitely falls into the former and not the latter category.</p><p>With all that said, I overall think the text makes a lot of intellectual contributions, some of which have a &#8220;potentially big if (something in this vicinity is) true and plausibly (something in this vicinity is) true&#8221; feel, and merits a top prize.</p><h3>BS&#8217;s comments</h3><p>I found this to be one of the more thought-provoking entries. I liked that it made ambitious positive proposals.</p><p>To echo some of Chi&#8217;s comments: because this entry is receiving a top prize, I also want to encourage readers to approach it as food for thought and with a strong critical lens. 
To that end, I&#8217;ll mention a few somewhat critical reflections:</p><ul><li><p>As an academic philosopher with only a moderate level of familiarity with Rationalist ideas, I was struck by how many of the interesting ideas in the entry were ones that I had previously encountered in Rationalist writings. I also suspected that I would have a better sense of what is original in the entry if I were more familiar with Rationalist writings and that I would have significantly overestimated the originality of the entry if (like many of my fellow academic philosophers) I were wholly unfamiliar with Rationalist ideas. I offer this mainly as one observation that readers might use to inform their own judgement of the essay rather than as a concern about its philosophical substance.</p></li><li><p>I appreciated that the entry was able to cover a lot of ground in virtue of not belaboring qualifications. But at times I thought it went too far in this direction. As a fairly central example, my impression was that the entry largely theorized about philosophy both without displaying deep familiarity with the history or discipline of philosophy and without sufficiently registering the epistemic risks associated with doing so. More generally, my impression was that this entry achieved an especially high degree of boldness partly at significant expense to epistemic caution. Irrespective of the fruits of that tradeoff in this particular case, I&#8217;d generally prefer that researchers working in this area place more weight on epistemic caution and be averse to making this type of tradeoff.</p></li><li><p>The entry&#8217;s proposed operationalization of philosophy seemed to me to capture, at best, only a fairly narrow aspect of philosophy. I suspect that the most useful and interesting way to understand the proposal is as targeting something else (perhaps to do with the development of conceptual schemes, as I think another judge suggested in a contest meeting). 
So, when considering the entry&#8217;s proposal, readers might want to contemplate what the best target for the proposal would be.</p></li></ul><h2>Essays on training wise AI systems &#8212; Chris Leong &#8212; $4,000</h2><ul><li><p><a href="https://aiimpacts.org/an-overview-of-obvious-approaches-to-training-wise-ai-advisors/">An Overview of &#8220;Obvious&#8221; Approaches to Training Wise AI Advisors</a></p></li><li><p><a href="https://aiimpacts.org/some-preliminary-notes-on-the-promise-of-a-wisdom-explosion/">Some Preliminary Notes on the Promise of a Wisdom Explosion</a></p></li></ul><p>These were submitted as part of a single essay, which centrally argued for a particular approach to training wise AI systems. In response to judge feedback it was split into two relatively separate parts, and the emphasis on the imitation learning approach was lessened.</p><h3>Summary</h3><p>The first essay compares the advantages and disadvantages of four &#8220;obvious&#8221; approaches to producing wise AI systems:</p><ul><li><p>Imitation learning: Training imitation learning agents on a bunch of people the lab considers to be wise</p></li><li><p>The Direct Approach: Training an AI to be wise using optimisation based on human demonstrations and feedback</p></li><li><p>The Principled Approach: Attempting to understand what wisdom is at a deep, principled level and building an AI that provides advice according to those principles</p></li><li><p>The Scattergun Approach: Just throwing a bunch of potentially relevant wise principles and/or anecdotes (nuggets of wisdom) from a fixed set at the deciders, in the hope that reading through them will lead to a wise decision</p></li></ul><p>The second essay introduces and examines the idea of a &#8220;wisdom explosion&#8221;, in analogy to an intelligence explosion. 
It argues that it is plausible that a wisdom explosion could be achieved, and that it looks potentially favourable to pursue from a differential technology development perspective.</p><h3>CN&#8217;s comments</h3><p>Tl;dr: Gives an overview of the &#8220;obvious&#8221; approaches to training for wisdom and considerations around them.</p><p>The text aims to convince the reader of the imitation learning approach to training AIs to be wise. To this aim, it contrasts it with other possible &#8220;obvious&#8221; approaches. I found the resulting overview of &#8220;obvious&#8221; approaches and the considerations around them to be the more valuable part of this text compared to the arguments in favour of the imitation learning approach. I can imagine this being a handy resource to look at when thinking about how to train wisdom: as a starting point, as a refresher, and as a way to double-check that one hasn&#8217;t forgotten anything important.</p><p>That said, I thought the analysis of the different proposals was reasonable but didn&#8217;t stand out to me, which also contributes to me not being convinced that the imitation learning approach is in fact the best approach. This would possibly have been better if the entry had been written out in prose: I personally find that bullet points often hide gaps in argumentation that I notice when I try to write them out. (This might also be completely idiosyncratic to me.)</p><p>Overall, I think this text deserves a top prize for doing solid, potentially useful work.</p><h3>BS&#8217;s comments</h3><p>This entry was one of my nominations for a top prize. It looks at a number of different proposals for making AI wise. 
I thought it did an exceptionally good job of considering a range of relevant proposals, thinking about obstacles to implementation, and making a sustained effort to weigh up pros and cons of rival approaches.&nbsp; Although I think some of the proposals this entry discusses merit further consideration, I&#8217;d especially like to see more work in this area that follows this one in considering a range of rival proposals, obstacles to implementation, and pros and cons.</p><h2><a href="https://thediscontinuity.substack.com/p/should-we-just-be-building-more-datasets">Should we just be building more datasets?</a> &#8212; Gabriel Recchia &#8212; $3,000</h2><h3>Summary</h3><p>Building out datasets and evaluations focused on 'wisdom-relevant skills' could significantly contribute to our ability to make progress on scalable oversight and avoid large-scale errors more broadly. Specifically, I suggest that curating datasets for use as fine-tuning corpora or prompts to elicit latent model capabilities in areas such as error detection, task decomposition, cumulative error recognition, and identification of misleading statements could be a particularly worthwhile project to attempt. These datasets should cover a range of domains to test whether these skills can be elicited in a general way. This kind of work is relatively accessible, not requiring insider access to unpublished frontier models, specialized ML experience, or extensive resources.</p><h3>CN&#8217;s comments</h3><p>Tl;dr: Makes the case for a direction I think is promising, but not a lot of novelty or &#8220;meat&#8221;.</p><p>This is a short, pragmatic and relatively simple submission pointing out that we might just want to do the obvious thing and generate training data on things we consider important for wisdom. I have a lot of sympathy for concise, &#8220;not trying to be fancy&#8221; proposals that, if implemented, could be very impactful. 
I also think the text did a good job explaining for which domains additional training data is and isn&#8217;t helpful.</p><p>That said, I thought the intellectual contribution of this text could have been bigger. First, the text leaves out the analysis of whether wisdom is a domain for which additional training data is helpful. Second, how successful a potential implementation would be comes down a lot to the details, i.e. what exactly to put in the training data. While the text offers some preliminary thoughts on this, I think they are not enough to meaningfully guide a project trying to implement the proposal, so any implementers would have to basically think about this from scratch. <em>[Editor&#8217;s note: the final version of the essay does include more ideas along these lines, and also links to a <a href="https://mesotron.github.io/findtheflaws/">practical project</a> building datasets.] </em>This seems particularly relevant given that the basic idea of the proposal isn&#8217;t very original.</p><p>(To be clear, I think it&#8217;s much better to be not original and impactful than original and unlikely to ever be implemented or help. That&#8217;s part of the charm of this submission! But when evaluating this in terms of intellectual contribution, I would have liked it to either be novel or more fleshed out.)</p><p>Overall, I think this text deserves a top prize and I liked it for what I think it is: a brief, high-level proposal for a promising, if not very original, agenda.</p><h3>BS&#8217;s comments</h3><p>This entry identifies some LLM wisdom-relevant skills that could plausibly be improved with suitable datasets using existing methods. I see this entry as belonging to the category of saying sensible, relatively straightforward stuff that&#8217;s worth saying rather than to the category of trying to offer theoretical insights. 
On the current margin, I think work in this area that aims to contribute via the former category is more to be encouraged than work that aims to contribute via the latter. So I liked that this entry nicely exemplified a category in which I&#8217;d like to see more work. I also liked that it identified a task (building datasets) that could plausibly lead to quality improvements in automated philosophy and which is a task that philosophers are already in a position to help with.</p><h1>Runner-up prizes</h1><h2><a href="https://forum.effectivealtruism.org/posts/z5pKtbjpK3wZWbSmk/designing-artificial-wisdom-the-wise-workflow-research">Designing Artificial Wisdom: The Wise Workflow Research Organization</a> &#8212; Jordan Arel</h2><p>This was the judges&#8217; favourite of a series of four relatively standalone essays the author wrote on artificial wisdom. The other three are:</p><ul><li><p><a href="https://forum.effectivealtruism.org/posts/wQmWTkD8pMqKtHYEH/on-artificial-wisdom">On Artificial Wisdom</a></p></li><li><p><a href="https://forum.effectivealtruism.org/s/QLRcxz4k9xxxZikgB/p/k5yF9dYHEP6uj7hgt">Designing Artificial Wisdom: GitWise and AlphaWise</a></p></li><li><p><a href="https://forum.effectivealtruism.org/s/QLRcxz4k9xxxZikgB/p/fyriLzyLmuHuv6jxh">Designing Artificial Wisdom: Decision Forecasting AI &amp; Futarchy</a></p></li></ul><h3>Summary</h3><p>Even simple workflows can greatly enhance the performance of LLMs, so artificially wise workflows seem like a promising candidate for greatly increasing Artificial Wisdom (AW).</p><p>This piece outlines the idea of introducing workflows into a research organization which works on various topics related to AI Safety, existential risk &amp; existential security, longtermism, and artificial wisdom. 
Such an organization could make progressing the field of artificial wisdom one of their primary goals, and as workflows become more powerful they could automate an increasing fraction of work within the organization.</p><p>Essentially, the research organization, whose goal is to increase human wisdom around existential risk, acts as scaffolding on which to bootstrap artificial wisdom.</p><p>Such a system would be unusually interpretable since all reasoning is done in natural language except that of the base model. When the organization develops improved ideas about existential security factors and projects to achieve these factors, they could themselves incubate these projects, or pass them on to incubators to make sure the wisdom does not go to waste.</p><h3>CN&#8217;s comments</h3><p>I like the series for what it shows about the author thinking from ground up and at a high-level about all the relevant aspects of the automation of philosophy and wisdom. I thought the scheme presented here was reasonable although fairly vague and, at its level of abstraction, not very novel.</p><h2><a href="https://www.goodthoughts.blog/p/philosophys-digital-future">Philosophy's Digital Future</a> &#8212; Richard Yetter Chappell</h2><h3>Summary</h3><p>I suggest that a future "PhilAI" may (at a minimum) be well-suited to mapping out the philosophical literature, situating (e.g.) every paper in the PhilPapers database according to its philosophical contributions and citation networks. The resulting "PhilMap" would show us, at a glance, where the main &#8220;fault lines&#8221; lie in a debate, and which objections remain unanswered. This opens up new ways to allocate professional esteem: incentivizing philosophers to plug a genuine gap in the literature (or to generate entirely new branches), and not just whatever they can sneak past referees. 
I further argue that this could combine well with a "Publish then Filter" model (replacing pre-publication gatekeeping with post-publication review) of academic production.</p><h3>BS&#8217;s comments</h3><p>I liked that this entry made concrete suggestions for using AI to improve information flow across the field of philosophy and that these suggestions were grounded in knowledge of how the field currently operates.&nbsp;</p><p>Regardless of whether one is sold on the particular proposals in this entry, I think its proposals encourage optimism about the availability of options for accelerating philosophical progress, options that can be uncovered without theoretical insights just by reflecting on how to apply existing tools across the field. My sense is that there&#8217;s a lot of room for improvement within academic philosophy when it comes to coordination and experimenting with concrete proposals for, say, improving the publication system. So, I&#8217;d also be particularly excited for future work on using automation to improve the ecosystem of academic philosophy to happen alongside work geared at promoting the requisite sorts of coordination/experimentation for implementing or testing promising proposals.</p><h3>CN&#8217;s comments</h3><p>I thought this presented cool product ideas that would improve Philosophy and that I&#8217;d be excited for someone to pursue. That said, I think it wasn&#8217;t the most relevant to making sure AGI goes well.</p><h2><a href="https://jimmyalfonsolicon.substack.com/p/synthetic-socrates-and-the-philosophers">Synthetic Socrates and the Philosophers of the Future</a> &#8212; Jimmy Alfonso Licon</h2><h3>Summary</h3><p>Many philosophers prize finding deep, important philosophical truths such as the nature of right and wrong, the ability to make free choices, and so on. 
Perhaps, then, it would be better for such philosophers to outsource the search for such truths to entities that are better equipped for the task: artificial philosophers. This suggestion may initially appear absurd, until we realize that outsourcing tasks has been the norm throughout human history. To the extent such philosophers care about discovering deep philosophical truths, they have a reason to aid in the creation of artificial philosophers who will eventually, in many respects, do philosophy better than even the best human philosopher who ever lived or ever will live.</p><h3>CN&#8217;s comments</h3><p>I thought this was a well-written piece of communication that I&#8217;m happy to have out there and shared with people. That said, I don&#8217;t think it will offer any new considerations to the typical person who knows about our competition.</p><h3>BS&#8217;s comments</h3><p>I take the central idea of this piece to be that philosophers who care about the discovery of deep philosophical truths have reason to aid in the creation of artificial philosophers rather than simply trying to discover such truths. I&#8217;d be delighted for more academic philosophers to engage with this idea.&nbsp;</p><h2><a href="https://www.researchgate.net/publication/384862399_The_Web_of_Belief_A_Research_Program_for_Using_Formal_Ontologies_and_Artificial_Intelligence_to_Represent_Belief_Structures_and_Bolster_Belief_Formation">The Web of Belief</a> &#8212; Paal Fredrik S. Kvarberg</h2><h3>Summary</h3><p>In this essay, I present a method for using technological innovations to improve rational belief formation and wise decision-making in an explainable manner. I assume a view of rationality in which beliefs are evaluated according to norms of intelligibility, accuracy and consistency. These norms can be quantified in terms of logical relations between beliefs. 
I argue that Bayesian networks are ideal tools for representing beliefs and their logical interconnections, facilitating belief evaluation and revision. Bayesian networks can be daunting for beginners, but new methods and technologies have the potential to make their application feasible for non-experts. AI technologies, in particular, have the potential to support or automate several steps in the construction and updating of Bayesian networks for reasoning in an explainable way. One of these steps consists of relating empirical evidence to theoretical and decision-relevant propositions. The result of using these methods and technologies would be an AI-powered inference engine we can query to see the rational support, empirical or otherwise, of key premises to arguments that bear on important practical decisions. With these technological innovations, decision support systems based on Bayesian networks may represent belief structures and improve our understanding, judgement, and decision-making.</p><h3>CN&#8217;s comments</h3><p>I thought this presented a potentially cool tool that I might be excited to see someone work on. I also really appreciate that the author already implemented prototypes and did experiments with them. Very cool. That said, I am sceptical that this tool will ultimately have a big impact (but would love for reality to convince me otherwise).</p><h2><a href="https://www.expectedsurprise.com/p/wise-ai-support-for-government-decision">Wise AI support for government decision-making</a> &#8212; Ashwin Acharya and Michaelah Gertz-Billingsley</h2><h3>Summary</h3><p>Governments are behind the curve on adopting advanced AI, in part due to concerns about its reliability. Suppose that developers gradually solve the issues that make AIs poor advisors &#8212; issues like confabulation, systemic bias, misrepresentation of their own thought processes, and limited general reasoning skills. 
What would make smart AI systems a good or poor fit as government advisors?</p><p>We recommend the development of wise decision support systems that answer the questions people <em>should</em> have asked, not just the ones they did.</p><p>We recommend that government-advising institutions (e.g. contracting orgs like RAND) begin developing such systems, scaling them up over time as they acquire data and AI becomes more capable. Initially, AI systems might mostly summarize and clarify the views of human experts; later, they may guide and reframe conversations; eventually, they may largely <em>constitute</em> the expert conversation that informs government decision-making on a topic.</p><h3>CN&#8217;s comments</h3><p>I thought this post highlights a very important area, decision-making in governments, and argues for something that I think is clearly true: Governments should implement (cheap) tools to improve their belief-formation and decision-making. I also liked that they discussed the availability of data. That said, I didn&#8217;t think the text offered either very novel ideas or a concrete enough proposal to make it meaningfully easier for groups to implement the proposal.</p><h2><a href="https://aiimpacts.org/machines-and-moral-judgment/">Machines and moral judgement</a> &#8212; Jacob Sparks</h2><h3>Summary</h3><p>Creating AI that is both generally intelligent and good requires building machines capable of making moral judgments, a task that presents philosophical and technical difficulties.</p><p>Current AI systems and much of what people imagine when they talk about &#8220;moral machines&#8221; may make morally significant or morally correct decisions, but they lack true moral reasoning.</p><p>Moral reasoning allows for reflective distance from one&#8217;s motivations and from any description under which one might act. 
It seems to challenge the distinction between beliefs and desires and to sit uncomfortably with a naturalistic worldview.</p><p>While there are promising tools in reinforcement learning for building agents that make moral judgments, we need more work on the philosophical puzzles raised by moral judgment and to rethink core RL concepts like action, reward, value and uncertainty.</p><h3>CN&#8217;s comments</h3><p>I thought the discussion of the peculiarities of morality and hence potential challenges with implementing it in AI systems was well-done if not novel. That said, I really strongly disagree with its last section &#8220;But How&#8221; where the author discusses how particular AI training techniques could help and thought that was by far the weakest part of the essay. In particular, I thought the connection the author draws between uncertainty and moral reflection is dubious and disagree with the reasons to conclude that RL is more promising than supervised finetuning. I think the text could have been improved with simple changes to the framing. (E.g. &#8220;Supervised finetuning has these flaws. Perhaps RL is better? Here are some reasons to think so. But, unfortunately, RL also has some flaws. So, maybe not.&#8221;) <em>[Editor&#8217;s note: The framing was adjusted in response to this feedback.]</em></p><h2><a href="https://aiimpacts.org/the-purpose-of-philosophical-ai-will-be-to-orient-ourselves-in-thinking/">The purpose of philosophical AI will be: to orient ourselves in thinking</a> &#8212; Maximilian Noichl</h2><h3>Summary</h3><p>In this essay I will suggest a lower bound for the impact that artificial intelligence systems can have on the automation of philosophy. 
Specifically, I will argue that while there is reasonable skepticism about whether LLM-based systems sufficiently similar to the best ones available right now will, in the short to medium term, be able to independently produce philosophy at a level of quality and creativity that makes them interesting to us, they are clearly able to solve medium-complexity language tasks in a way that makes them useful for structuring and integrating the contemporary philosophical landscape, allowing for novel and interesting ways to orient ourselves in thinking.</p><h3>CN&#8217;s comments</h3><p>I really liked that this entry just went ahead and actually tried to implement a prototype for a tool that can help with automating Philosophy. I also think the type of tool they prototyped could become really useful. (In a way, it&#8217;s a first implementation of the improvements to organising academic literature that another entry proposed.)</p><h3>BS&#8217;s comments</h3><p>In a pilot study, the author of this entry used automation tools to map some literature on the philosophy of AI. I was happy to see an actual instance of automating something philosophical, that the instance was of a sort that could plausibly be useful if well-implemented at scale, and that the instance could be seen as a sort of proof of concept of a proposal in another entry (&#8220;Philosophy's Digital Future&#8221;) to use AI to map philosophical literature.</p><h2><a href="http://aiimpacts.org/wp-content/uploads/2024/10/Popper_vonStengel_Cross_context_deduction-1.pdf">Cross-context deduction: on the capability necessary for LLM-philosophers</a> &#8212; Rio Popper and Clem von Stengel</h2><h3>Summary</h3><p>In this paper, we define a capability we call &#8216;cross-context deduction&#8217;, and we argue that cross-context deduction is required for large language models (LLMs) to be able to do philosophy well. 
First, we parse out several related conceptions of inferential reasoning, including cross-context deduction. Then, we argue that cross-context deduction is likely to be the most difficult of these reasoning capabilities for language models and that it is particularly useful in philosophy. Finally, we suggest benchmarks to evaluate cross-context deduction in LLMs and possible training regimes that might improve performance on tasks involving cross-context deduction. Overall, this paper takes an initial step towards discerning the best strategy to scalably employ LLMs to do philosophy.</p><h3>CN&#8217;s comments</h3><p>I really enjoyed the entry from an ML perspective and also thought it was very easy to read. I also really like the general approach of &#8220;Take some fairly well defined types of capabilities that we can differentiate in the context of ML, think about which ones are the kinds of capabilities we <em>want</em> to differentially improve, and suggest promising ways of doing so based on the existing ML literature.&#8221; That said, I&#8217;m unfortunately very unsure that boosting what the authors call cross-context deduction* would in fact be a good idea. While I thought their discussion of cross-context deduction* and Philosophy was really interesting, my current guess is that if we succeeded at boosting cross-context deduction* this would be net bad, i.e. it would net advance dangerous capabilities more than is justified by the boost to Philosophy.</p><p>(*We had some discussion about whether they defined deduction correctly, but I think they successfully managed to point at a kind of capability that is distinct enough for practical purposes.)</p><h3>DM&#8217;s comments</h3><p>I have three main concerns about this paper.&nbsp;</p><p>First, it&#8217;s unclear what the authors mean by &#8220;deduction&#8221;: they introduce it <em>both</em> as reasoning from universals to particulars (e.g. 
on pp. 2 and 6), <em>and</em>, more commonly, as reasoning in which premises &#8220;necessarily imply&#8221; the conclusion. But these are very different things. &#8220;Socrates is human, therefore something is human&#8221; is a deductive inference (in the latter sense) that moves from particular to universal. Conversely, we infer from universal to particular in an <em>uncertain</em> way all the time, e.g. applying statistics to a case.&nbsp;</p><p>In what follows, I&#8217;ll assume that by &#8220;deduction&#8221; the authors really mean &#8220;the type of reasoning that preserves truth with certainty&#8221;. (Another alternative is that they mean &#8220;inference by way of the application of logical laws&#8221;, but in that case the category depends on the choice of a logical system.) But why think of this as a <em>type of reasoning</em>, beyond the fact that there&#8217;s an old tradition of doing so? Maybe instead we should think of this as a specific epistemic feature that happens to attend some instances of various types of reasoning, individuated by something like an underlying cognitive procedure. That is, perhaps a better taxonomy will distinguish between semantic reasoning, statistical reasoning, visual reasoning, conditionalization, etc., <em>each</em> of which allows for various degrees of epistemic distance between premises and conclusion, <em>including</em>&#8212;in the limit case&#8212;the preservation of certainty.&nbsp;</p><p>My second concern is that no convincing case is made that there is anything especially <em>deductive</em> about philosophical reasoning, or that there is anything especially <em>philosophical</em> about deductive reasoning (cross-context or otherwise), in <em>either</em> of the senses of &#8220;deduction&#8221; conflated in the paper. Indeed, paradigmatically philosophical arguments can usually be presented either in a deductive form or an inductive one. 
The deductive version will have more tendentious premises of which we are uncertain; the inductive version will have more certain premises but won&#8217;t entail the conclusion.&nbsp;</p><p>Third, the paper uses &#8220;context&#8221; in &#8220;cross-context&#8221; in two very different ways: applied to humans it seems to mean roughly a domain of inquiry or a source of information, but applied to LLMs it seems to literally mean their &#8220;context window&#8221; (the input text supplied by the user). But there is no clear relationship between these two things aside from the fact that the word &#8220;context&#8221; is used for both of them. So even if philosophical reasoning by humans is often &#8220;cross-context&#8221; in the former sense, it doesn&#8217;t follow that there&#8217;s any special relationship to &#8220;cross-context&#8221; reasoning by LLMs in the latter sense.&nbsp;&nbsp;</p><h2><a href="http://aiimpacts.org/wp-content/uploads/2024/10/Essay-competition-prizewinner_-Evolutionary-perspectives-on-AI-values.pdf">Evolutionary perspectives on AI values</a> &#8212; Maria Avramidou</h2><h3>Summary</h3><p>Addressing the question of value drift is crucial for deciding the extent to which AI should be automated. AI has the potential to develop capabilities that, if misaligned with human values, could present significant risks to humanity. The decision to automate AI depends heavily on our ability to trust that these systems will reliably maintain this alignment. Evolutionary theory provides a valuable framework for predicting how AI values might change, highlighting the role of environmental incentives and selection pressures in shaping AI behavior. Current discussions often focus on the inevitability of AI developing selfish values due to these pressures. 
I analyse these arguments and provide a case against these claims by drawing examples from evolution.</p><h3>CN&#8217;s comments</h3><p>This text, while somewhat off-topic in my view, highlights value drift as an important and neglected area. I think that&#8217;s valuable and it does a good job characterising it alongside what they call value specification and value monitoring (which I would loosely translate into outer and inner alignment). I also think it&#8217;s reasonable to discuss evolution in this context. That said, I thought the text missed an opportunity to really analyse the extent to which the evolutionary analogy is relevant in the AI context (and perhaps at different stages in the AI context, e.g. training and post-deployment) and what other mechanics might be at play. The text failed to convince me that the evolutionary arguments it gave have bearing on AI and, in my view, unfortunately it also doesn&#8217;t seem to try to do so.</p><h3>BS&#8217;s comments</h3><p>(I recused myself from judging this entry because I had worked with the author on related work.)</p><h2><a href="https://forum.effectivealtruism.org/posts/tuAnrhy2BExGkwTKE/tentatively-against-making-ais-wise">Tentatively against making AIs 'wise'</a> &#8212; Oscar Delaney</h2><h3>Summary</h3><ul><li><p>Wisdom is best conceived of as being more intuitive than carefully reasoned.
This is helpful in order to distinguish &#8216;wisdom&#8217; from &#8216;rationality&#8217; or &#8216;good thinking&#8217;.</p></li><li><p>Intuitions, including wise intuitions, are easy to communicate but hard to justify, and need to be taken &#8216;on faith&#8217;.</p></li><li><p>It is very important that early AGIs are transparent and that their justifications for any actions they propose can be checked by a human.</p></li><li><p>Therefore, we should prefer to train smart, careful-reasoning AIs rather than inscrutable, wisdom-nugget-dispensing AIs.</p></li><li><p>Arguably I am unreasonably shifting the goalposts of the essay competition. The more positive framing is that I am &#8220;noticing that an old ontology was baking in some problematic assumptions about what was going on&#8221; and therefore I am actually being wise!</p></li></ul><h3>CN&#8217;s comments</h3><p>I liked that the entry questions the premise of the competition. I also think it makes a reasonable point that inspires some thinking, and does so in a concise and clear way. A more complete version of this text would have, for example, tried to reason about what it is about wisdom that the competition hosts find appealing and whether there is some wisdom-related property that would be beneficial to promote in AI, or otherwise addressed a potential trade-off between the author&#8217;s concern and wise AI. (Or otherwise tried to turn the competition premise into its most interesting version.)</p><h1>Concluding thoughts</h1><p>We found it fascinating and informative getting to read the competition entries and then hear other people&#8217;s perspectives on them. We hope that many readers will find the same.
The greatest success of this competition will be if it sparks further thought and engagement, and helps these nascent fields to find their feet a little earlier than they might otherwise have done.</p><p>Here are some closing thoughts from our judges:</p><h3>BS&#8217;s thoughts</h3><p>Views I had before reading contest entries:</p><ul><li><p>I thought it&#8217;d probably be extremely difficult to make big-picture theoretical insights about how to automate philosophy.</p></li><li><p>I was mildly optimistic about the prospects for narrower theoretical insights about automating philosophy and about non-theoretical approaches to advancing the automation of philosophy in desirable ways.</p></li><li><p>I was somewhat skeptical about the fruitfulness of automating wisdom as a research direction.</p></li></ul><p>How reading contest entries affected my views:</p><ul><li><p>Reading entries reinforced my confidence in the extreme difficulty of making big-picture theoretical insights about how to automate philosophy.</p></li><li><p>Reading entries made me more optimistic about the prospects for narrower theoretical insights about automating philosophy and about non-theoretical approaches to advancing the automation of philosophy in desirable ways.</p></li><li><p>Reading entries made me think that there&#8217;s more fruitful theoretical work to be done on wisdom than I&#8217;d expected and that the automation of wisdom is a research avenue at least worth pursuing more on the current margin. (Currently, I take there to be very little work on this.)</p></li></ul><p>Some takes on future work in these areas:</p><ul><li><p>I&#8217;d be excited for senior philosophers who have worked in multiple areas and on philosophical methodology to do big-picture, foundational work on automating philosophy in collaboration with AI researchers. (Prominent philosophers who fit the sort of profile I have in mind include David Chalmers, Alan H&#225;jek, and Timothy Williamson.) 
In contrast, I&#8217;d be more excited about non-philosophers and early-career philosophers working on aspects of the topic that are narrower and/or more applied.</p></li><li><p>I think there&#8217;s a lot of progress to be made in automating philosophy in desirable ways that doesn&#8217;t require theoretical breakthroughs. So I think it&#8217;d be good for more people to do things like:</p><ul><li><p>find ways to facilitate philosophical reasoning with existing tools of automation,</p></li><li><p>develop datasets and tests for philosophical reasoning in AI systems,</p></li><li><p>find ways to use automation to improve information flow throughout the field of academic philosophy,&nbsp;</p></li><li><p>do empirical work on existing AI systems that tests hypotheses about those systems&#8217; philosophical reasoning abilities, and</p></li><li><p>do empirical work on existing AI systems that provides compelling toy illustrations of philosophical failure modes that could lead to catastrophes in powerful, future AI systems.</p></li></ul></li><li><p>I&#8217;d like to see work in the genre of Leong&#8217;s <em>An Overview of &#8220;Obvious&#8221; Approaches to Training Wise AI Advisors</em>: that is, work that lays out some approaches to automating wisdom along with their pros and cons. I&#8217;d also like to see some work that implements particular approaches to automating wisdom. I&#8217;m not confident that such implementations would be useful, but I think trying them would at least provide evidence about the extent to which this direction is worth pursuing.</p></li></ul><h3>DM&#8217;s thoughts</h3><p>Reading and thinking about these submissions, it strikes me that there are two kinds of goals for which one might want to try to automate philosophy and/or wisdom.
One is the goal of helping to improve <em>human</em> thinking: perhaps AIs can guide us to understand our values better, help us achieve new philosophical insights, or help us reason better about how to build better institutions or future AIs. Importantly, a system that supports this goal wouldn&#8217;t need to be wise or philosophical itself: consider, for example, the AI that Yetter Chapell envisions, which simply provides a map of philosophical contributions, or the systems described by Acharya and Gertz-Billingsley, which support government decision-making. I was impressed by the variety and practicality of the suggestions for making progress towards the goal of improving human reasoning.&nbsp;</p><p>A very different goal is that of imbuing the AIs <em>themselves</em> with wisdom, and/or having them arrive at philosophical conclusions on our behalf. Having read these submissions, I am both more optimistic that this can be achieved, and hesitant about whether we want to achieve it. Consider, for example, Delaney&#8217;s point that many features we associate with wisdom are really not the sort of features with which we&#8217;d want to imbue AIs, at least at first. (Laine likewise places &#8220;illegibility&#8221; on the &#8220;wisdom&#8221; side of the &#8220;intelligence/wisdom&#8221; divide.) So perhaps <em>wisdom</em> isn&#8217;t the right concept to focus on, after all. Likewise, if an AI is trained to be very capable at philosophical reasoning, we will probably want to inspect its whole chain of reasoning rather than trust its philosophical hunches. And the more legible we can make such a system, the more its output will actually function simultaneously to improve human reasoning.&nbsp;</p><p>As much as I would love to see more progress made in philosophy generally, I do think the most pressing problem for which we need the help of AI is the problem of alignment itself. 
Here my guess is that systems aimed at supporting human reasoning could be quite helpful; but for the reasons just given, I&#8217;m less optimistic that it would be very useful to aim at making AIs &#8220;wise&#8221; or even good at philosophy <em>per se</em>, as a stepping stone towards alignment. One kind of exception might be systems trained to reason carefully (but also legibly!) about human values, in a way that might ultimately help us design systems to reflect them.&nbsp;</p><h3>AS&#8217;s thoughts</h3><p>My reaction after reading the submissions: "Wow, this area really needs a lot of work". The gap between what's there and what seems possible feels large.</p><p>Ideally work in this area:</p><ul><li><p>Advances our understanding of the relevant concepts and builds up new ones as needed</p></li><li><p>Follows good epistemic practices and sound reasoning</p></li><li><p>Is concrete and connected to experiments, observations, or implementations</p></li></ul><p>I'd say the essays have about 1-1.5 of these three features on average (and I'm grateful that they do). They don't all have the same such features, so this doesn't seem to be an intrinsic limit of the domain.</p><p>I&#8217;m pretty sure you could take any one of the essays, score it along the three dimensions, and then develop it further along the dimensions where it's lacking.</p><p>This means that there is already one clear recipe for further work&#8212;significantly improve what's there along one dimension, turning it into something quite different in the process. Beyond this basic recipe, there's comparing, combining, and systematizing ideas across different essays.
And finally, <a href="https://blog.aiimpacts.org/p/essay-competition-on-the-automation">the original competition announcement</a> poses excellent questions that none of the essays seem to directly address, especially in the Thinking ahead and Ecosystems sections.</p><p>As part of my work on Elicit, I regularly ask myself "What kind of reasoning is most worth advancing, and how?", and I know other founders whose roadmaps are open to arguments &amp; evidence. Of course, good thinking on automating wisdom would likely have many diffuse positive effects, but I wanted to flag this one path to impact that feels especially real to me, and makes me personally very excited to see more work in this area.</p><h3>Acknowledgements</h3><p>Thanks to AI Impacts for hosting this competition, and to the donors who allowed it to happen. And of course, a big thanks to the people who took up the challenge and wrote entries for the competition &#8212; this would all have been nothing without them.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>After in-depth judge discussions, there was more consensus on which were the top four entries, than on the proper ordering between those. We therefore determined to split a $20,000 pool for top prizes among these four, according to average judge views.</p><p>As there were a good number of entries that judges felt could be deserving of prizes, the remaining prize money was split into $500 runner-up prizes, to allow for more total prizes.</p><p>Although the competition announcement specified &#8220;category&#8221; prizes, in practice the categories did not seem very natural &#8212; many entries lay in multiple categories, and for many entries different people would make different assessments about which categories they lay in. 
As all of the categories were represented in the top four, we decided to move away from the idea of category prizes.</p><p>The principles that were important to us in reworking the prize schedule were:</p><ul><li><p>To reflect judge sentiment</p></li><li><p>To have the same total amount of prize-money</p></li><li><p>To keep the eventual prize amounts reasonably straightforward</p></li><li><p>To make things broadly similar in how evenly the money was distributed</p></li><li><p>Not to break any commitments that people might reasonably have counted on (like &#8220;if I have the best entry in category X, I&#8217;ll get at least $2,000&#8221;)</p></li></ul></div></div>]]></content:encoded></item><item><title><![CDATA[What happens if you present 500 people with an argument that AI is risky?]]></title><description><![CDATA[Recently, Nathan Young and I wrote about arguments for AI risk and put them on the AI Impacts wiki. In the process, we ran a casual little survey of the American public regarding how they feel about the arguments, initially (if I recall) just because we were curious whether the arguments we found least compelling would also fail to compel a wide variety of people.]]></description><link>https://blog.aiimpacts.org/p/what-happens-if-you-present-500-people</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/what-happens-if-you-present-500-people</guid><dc:creator><![CDATA[Katja Grace]]></dc:creator><pubDate>Wed, 04 Sep 2024 16:35:06 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Recently, Nathan Young and I wrote about arguments for AI risk and put them on the <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/start">AI Impacts wiki</a>.
In the process, we ran a casual little survey of the American public regarding how they feel about the arguments, initially (if I recall) just because we were curious whether the arguments we found least compelling would also fail to compel a wide variety of people.&nbsp;</p><p>The results were very confusing, so we ended up thinking more about this than initially intended and running four iterations total. This is still a small and scrappy poll to satisfy our own understanding, and doesn&#8217;t involve careful analysis or error checking. But I&#8217;d like to share a few interesting things we found. Perhaps someone else wants to look at our data more carefully, or run more careful surveys about parts of it.&nbsp;</p><p>In total we surveyed around 570 people across 4 different polls, with 500 in the main one. The basic structure was:</p><ol><li><p>p(doom): &#8220;If humanity develops very advanced AI technology, how likely do you think it is that this causes humanity to go extinct or be substantially disempowered?&#8221; Responses had to be given in a text box, a slider, or with buttons showing ranges</p></li><li><p>(Present them with one of eleven arguments, one a &#8216;control&#8217;)</p></li><li><p>&#8220;Do you understand this argument?&#8221;</p></li><li><p>&#8220;What did you think of this argument?&#8221;</p></li><li><p>&#8220;How compelling did you find this argument, on a scale of 1-5?&#8221;</p></li><li><p>p(doom) again</p></li><li><p>Do you have any further thoughts about this that you'd like to share?</p></li></ol><p>Interesting things:</p><ul><li><p>In the first survey, participants were much more likely to move their probabilities downward than upward, often while saying they found the argument fairly compelling. This is a big part of what initially confused us. We now think this is because each argument had counterarguments listed under it. 
Evidence in support of this: in the second and fourth rounds we cut the counterarguments and probabilities went overall upward. When included, three times as many participants moved their probabilities downward as upward (21 vs 7, with 12 unmoved).&nbsp;</p></li><li><p>In the big round (without counterarguments), <strong>arguments pushed people upward slightly more</strong>: 20% moved upward and 15% moved downward overall (and 65% stayed the same). On average, p(doom) increased by about 1.3% (for non-control arguments, treating button inputs as something like the geometric mean of their ranges).</p></li><li><p>But the <strong>input type seemed to make a big difference to how people moved</strong>!</p><div class="captioned-image-container"><figure><img src="https://substack-post-media.s3.amazonaws.com/public/images/9f7e353c-db08-435b-91fa-593948a9fd45_1192x736.png" width="1192" height="736" alt=""></figure></div><p>It makes sense to me that people move a lot more in both directions with a slider, because it&#8217;s hard to hit the same number again if you don&#8217;t remember it. It&#8217;s surprising to me that they moved with similar frequency with buttons and open response, because the buttons covered relatively chunky ranges (e.g.
5-25%), so larger shifts would be needed to register a change.</p></li><li><p><strong>Input type also made a big difference to the probabilities people gave to doom</strong> before seeing any arguments. People seem to give substantially lower answers when presented with buttons (Nathan proposes this is because there was a &lt;1% and a 1-5% button, which made lower probabilities more salient/&#8220;socially acceptable&#8221;, and I agree):</p><div class="captioned-image-container"><figure><img src="https://substack-post-media.s3.amazonaws.com/public/images/d65fd55a-fac7-4662-8650-7a79b1bd1aed_764x472.png" width="764" height="472" alt="" title="Chart"></figure></div></li><li><p>Overall, P(doom) numbers were fairly high: 24% average, 11% median.
</p><div class="captioned-image-container"><figure><img src="https://substack-post-media.s3.amazonaws.com/public/images/ab68b873-310e-4642-8430-59c9b19b5cb5_1600x1047.png" width="1456" height="953" alt=""></figure></div></li><li><p>We added a &#8216;control argument&#8217;.
We presented this as &#8220;Here is an argument that advanced AI technology might threaten humanity:&#8221; like the others, but it just argued that AI might substantially contribute to music production:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!D4uV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!D4uV!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png 424w, https://substackcdn.com/image/fetch/$s_!D4uV!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png 848w, https://substackcdn.com/image/fetch/$s_!D4uV!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png 1272w, https://substackcdn.com/image/fetch/$s_!D4uV!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!D4uV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png" width="1456" height="573" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:573,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!D4uV!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png 424w, https://substackcdn.com/image/fetch/$s_!D4uV!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png 848w, https://substackcdn.com/image/fetch/$s_!D4uV!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png 1272w, https://substackcdn.com/image/fetch/$s_!D4uV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12f7fd1b-a90e-4d4d-b95f-c52ca1012809_1600x630.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><br> This was the third-worst argument at prompting upward probability movement, but the third-best at being judged &#8220;compelling&#8221;. Overall it looked a lot like the other arguments, which is a bit of a blow to the model in which we can communicate somewhat adequately, &#8216;arguments&#8217; are more compelling than random noise, and the public can recognize the difference.</p></li><li><p>In general, combinations of claims about compellingness, changes in probabilities, and write-in answers were frequently hard to make sense of, especially if you treat the probability changes as meaningful rather than as random noise.
For instance, one participant rated an argument 4/5 for compellingness, yet reduced their P(doom) from 26% to 0%, and said &#8220;The above argument is completely based on probability of AI's effects on humanity near future, some feels that it could be turn into negative way but most people feels that it is going to a good aspect for future technology.&#8221; My sense is that this was more true in the first round than the fourth, so perhaps the counterarguments are doing something there.</p></li><li><p>This is how the different arguments fared:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!GSeU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!GSeU!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png 424w, https://substackcdn.com/image/fetch/$s_!GSeU!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png 848w, https://substackcdn.com/image/fetch/$s_!GSeU!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png 1272w, https://substackcdn.com/image/fetch/$s_!GSeU!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png 1456w" sizes="100vw"><img
src="https://substackcdn.com/image/fetch/$s_!GSeU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png" width="1434" height="888" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:888,&quot;width&quot;:1434,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!GSeU!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png 424w, https://substackcdn.com/image/fetch/$s_!GSeU!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png 848w, https://substackcdn.com/image/fetch/$s_!GSeU!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png 1272w, https://substackcdn.com/image/fetch/$s_!GSeU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F623134d7-b5eb-43c2-ba06-279c9918931c_1434x888.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft 
icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div></li><li><p>All arguments were rated, on average, between 2 and 3 in compellingness, including the control argument</p></li><li><p>The arguments we considered worst did roughly the worst in terms of probability change (black boxes, large impacts, multiagent dynamics, control)</p></li><li><p>The argument from expert opinion was the very worst, though, which is interesting to me because it seems to be the one people most often point to in public to justify concerns about x-risk.</p></li><li><p>The top arguments for increasing P(doom) were the ones about normal human processes getting out of hand (human non-alignment, catastrophic tools, speed), followed by the ones about bad new agents (second species, competent non-aligned agents, inferiority).
Compellingness looks related to probability change, but not closely.</p></li><li><p>We both found the experience of quickly polling the public enlivening.</p></li></ul><p>If you wish to look at the arguments in more detail, they are <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/start">here</a>. If you want to analyze the data yourself, or read everyone&#8217;s write-in responses, it&#8217;s <a href="https://docs.google.com/spreadsheets/d/1XWVg0l9BzWh_TX6ZxhIbD__Mdz1yF-IRIiGs_vnekOU/edit?usp=sharing">here</a>. If you see any errors, please let us know.</p>]]></content:encoded></item><item><title><![CDATA[Ten arguments that AI is an existential risk]]></title><description><![CDATA[and polls on which are the most compelling]]></description><link>https://blog.aiimpacts.org/p/ten-arguments-that-ai-is-an-existential</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/ten-arguments-that-ai-is-an-existential</guid><dc:creator><![CDATA[Katja Grace]]></dc:creator><pubDate>Tue, 13 Aug 2024 16:54:01 GMT</pubDate><enclosure
url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>This is a snapshot of a <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/start#large_impacts">new page</a> on the <a href="https://wiki.aiimpacts.org/start">AI Impacts Wiki</a>.</em></p><p>We&#8217;ve made a <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/start">list of arguments</a><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> that AI poses an existential risk to humanity. We&#8217;d love to hear how you feel about them in the comments and polls.&nbsp;</p><h2>Competent non-aligned agents</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9JiG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9JiG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png 424w, https://substackcdn.com/image/fetch/$s_!9JiG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!9JiG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!9JiG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9JiG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png" width="382" height="382" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d419d843-3790-469c-b119-05b21033ab8e_1024x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:382,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!9JiG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png 424w, https://substackcdn.com/image/fetch/$s_!9JiG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!9JiG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!9JiG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd419d843-3790-469c-b119-05b21033ab8e_1024x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Humans increasingly lose games to the best AI systems. 
If AI systems become similarly adept at navigating the real world, will humans also lose out? (Image: Midjourney)</figcaption></figure></div><p>Summary:</p><ol><li><p>Humans will build AI systems that are 'agents', i.e. they will autonomously pursue goals</p></li><li><p>Humans won&#8217;t figure out how to make systems with goals that are compatible with human welfare and realizing human values</p></li><li><p>Such systems will be built or selected to be highly competent, and so gain the power to achieve their goals</p></li><li><p>Thus the future will be primarily controlled by AIs, who will direct it in ways that are at odds with long-run human welfare or the realization of human values</p></li></ol><p>Selected counterarguments:</p><ul><li><p>It is unclear that AI will tend to have goals that are bad for humans</p></li><li><p>There are many forms of power. It is unclear that a competence advantage will ultimately trump all others in time</p></li><li><p>This argument also appears to apply to human groups such as corporations, so we need an explanation of why those are not an existential risk</p></li></ul><p>People who have favorably discussed<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> this argument (specific quotes <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_competent_non-aligned_agents/start#discussion_of_this_argument_elsewhere">here</a>): Paul Christiano (<a href="https://axrp.net/episode/2021/12/02/episode-12-ai-xrisk-paul-christiano.html">2021</a>), Ajeya Cotra (<a href="https://www.planned-obsolescence.org/what-were-doing-here/">2023</a>), Eliezer Yudkowsky (<a href="https://x.com/ESYudkowsky/status/1819532576115372384">2024</a>), Nick Bostrom (2014<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a>).</p><p>See 
also: <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_competent_non-aligned_agents/start">Full wiki page on the competent non-aligned agents argument</a></p><div class="poll-embed" data-attrs="{&quot;id&quot;:202494}" data-component-name="PollToDOM"></div><h2>Second species argument</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!6UBW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!6UBW!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png 424w, https://substackcdn.com/image/fetch/$s_!6UBW!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png 848w, https://substackcdn.com/image/fetch/$s_!6UBW!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png 1272w, https://substackcdn.com/image/fetch/$s_!6UBW!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!6UBW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png" width="382" height="319.2953296703297" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1217,&quot;width&quot;:1456,&quot;resizeWidth&quot;:382,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!6UBW!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png 424w, https://substackcdn.com/image/fetch/$s_!6UBW!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png 848w, https://substackcdn.com/image/fetch/$s_!6UBW!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png 1272w, https://substackcdn.com/image/fetch/$s_!6UBW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3845a43b-baab-4a36-b04f-4d65b6379f8f_1600x1337.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">An orangutan uses a stick to control juice, while humans use complex systems of tools, structures, and behavioral coordination to control the orangutan. Should orangutans have felt safe inventing humans, if they had had the choice? (Image: <a href="https://commons.wikimedia.org/wiki/File:Orangutan_using_precision_grip.jpg">William H. 
Calvin</a>)</figcaption></figure></div><p>Summary:</p><ol><li><p>Human dominance over other animal species is primarily due to humans having superior cognitive and coordination abilities</p></li><li><p>Therefore if another 'species' appears with abilities superior to those of humans, that species will become dominant over humans in the same way</p></li><li><p>AI will essentially be a 'species' with superior abilities to humans</p></li><li><p>Therefore AI will dominate humans</p></li></ol><p>Selected counterarguments:</p><ul><li><p>Human dominance over other species is plausibly not due to the cognitive abilities of individual humans, but rather because of human ability to communicate and store information through culture and artifacts</p></li><li><p>Intelligence in animals doesn't appear to generally relate to dominance. For instance, elephants are much more intelligent than beetles, and it is not clear that elephants have dominated beetles</p></li><li><p>Differences in capabilities don't necessarily lead to extinction. 
In the modern world, more powerful countries arguably control less powerful countries, but they do not wipe them out and most colonized countries have eventually gained independence</p></li></ul><p>People who have favorably discussed this argument (specific quotes <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/second_species_argument_for_ai_xrisk#discussion_elsewhere">here</a>): Joe Carlsmith (<a href="https://joecarlsmith.com/2024/01/02/gentleness-and-the-artificial-other#when-species-meet">2024</a>), Richard Ngo (<a href="https://www.alignmentforum.org/posts/8xRSjC76HasLnMGSf/agi-safety-from-first-principles-introduction">2020</a>), Stuart Russell (2020<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a>), Nick Bostrom (<a href="https://www.ted.com/talks/nick_bostrom_what_happens_when_our_computers_get_smarter_than_we_are/transcript?subtitle=en">2015</a>).</p><p>See also: <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/second_species_argument_for_ai_xrisk">Full wiki page on the second species argument</a></p><div class="poll-embed" data-attrs="{&quot;id&quot;:202660}" data-component-name="PollToDOM"></div><p></p><h2>Loss of control via inferiority</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!UnU_!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!UnU_!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png 
424w, https://substackcdn.com/image/fetch/$s_!UnU_!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png 848w, https://substackcdn.com/image/fetch/$s_!UnU_!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png 1272w, https://substackcdn.com/image/fetch/$s_!UnU_!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!UnU_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png" width="382" height="354" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:354,&quot;width&quot;:382,&quot;resizeWidth&quot;:382,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!UnU_!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png 424w, 
https://substackcdn.com/image/fetch/$s_!UnU_!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png 848w, https://substackcdn.com/image/fetch/$s_!UnU_!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png 1272w, https://substackcdn.com/image/fetch/$s_!UnU_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3097b4c-0883-464e-bc10-d3fc04c738ab_382x354.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line 
x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">The coronation of Henry VI. It can be hard for a child monarch to act in their own interests, even with official power, because they are so much less competent than their advisors. Humans surrounded by advanced AI systems may be in an analogous situation. (Image: <a href="https://catalogue.etoncollege.com/object-fda-e-2166-2015">Opie, John (R. A.), 1797</a>)</figcaption></figure></div><p>Summary:</p><ol><li><p>AI systems will become much more competent than humans at decision-making</p></li><li><p>Thus most decisions will probably be allocated to AI systems</p></li><li><p>If AI systems make most decisions, humans will lose control of the future</p></li><li><p>If humans have no control of the future, the future will probably be bad for humans</p></li></ol><p>Selected counterarguments:</p><ul><li><p>Humans do not generally seem to become disempowered by possession of software that is far superior to them, even if it makes many 'decisions' in the process of carrying out their will</p></li><li><p>In the same way that humans avoid being overpowered by companies, even though companies are more competent than individual humans, humans can track AI trustworthiness and have AI systems compete for them as users. 
This might substantially mitigate untrustworthy AI behavior</p></li></ul><p>People who have favorably discussed this argument (specific quotes <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_loss_of_control_through_inferiority">here</a>): Paul Christiano (<a href="https://paulfchristiano.medium.com/three-impacts-of-machine-intelligence-6285c8d85376">2014</a>), Ajeya Cotra (<a href="https://80000hours.org/podcast/episodes/ajeya-cotra-accidentally-teaching-ai-to-deceive-us/?t=9057#the-orphan-heir-with-a-trillion-dollar-fortune-005914">2023</a>), Richard Ngo (<a href="https://arxiv.org/pdf/2209.00626">2024</a>).</p><p>See also: <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_loss_of_control_through_inferiority">Full wiki page on loss of control via inferiority</a>&nbsp;</p><div class="poll-embed" data-attrs="{&quot;id&quot;:202661}" data-component-name="PollToDOM"></div><p></p><h2>Loss of control via speed</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jmby!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!jmby!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png 424w, https://substackcdn.com/image/fetch/$s_!jmby!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png 848w, 
https://substackcdn.com/image/fetch/$s_!jmby!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png 1272w, https://substackcdn.com/image/fetch/$s_!jmby!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!jmby!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png" width="382" height="394.1501272264631" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:811,&quot;width&quot;:786,&quot;resizeWidth&quot;:382,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!jmby!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png 424w, https://substackcdn.com/image/fetch/$s_!jmby!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png 848w, 
https://substackcdn.com/image/fetch/$s_!jmby!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png 1272w, https://substackcdn.com/image/fetch/$s_!jmby!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1055d5a3-7a74-4077-a1d2-c00a439b41b4_786x811.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">Tetris is a game that speeds up over time. 
As the time for a player to react grows shorter, the player's moves become worse, until the player loses. If advanced AI causes events to speed up, human responses might similarly become less and less adequate, potentially until humans lose all relevant control. (Image: <a href="https://commons.wikimedia.org/wiki/File:TetrisJS-GameOver.png">Cezary Tomczak, Maxime Lorant</a>)</figcaption></figure></div><p>Summary:</p><ol><li><p>Advances in AI will produce very rapid changes in available AI technology, other technologies, and society</p></li><li><p>Faster changes reduce humans' ability to exert meaningful control over events, because humans need time to make non-random choices</p></li><li><p>The pace of relevant events could become so fast as to allow for negligible relevant human choice</p></li><li><p>If humans are not continually involved in choosing the future, the future is likely to be bad by human lights</p></li></ol><p>Selected counterarguments:</p><ul><li><p>The pace at which humans can participate is not fixed. 
AI technologies will likely speed up processes for human participation.</p></li><li><p>It is not clear that advances in AI will produce very rapid changes.</p></li></ul><p>People who have favorably discussed this argument (specific quotes <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_loss_of_control_through_speed#discussion_of_this_argument_elsewhere">here</a>): Joe Carlsmith (<a href="https://arxiv.org/pdf/2206.13353">2021</a>).</p><p>See also: <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_loss_of_control_through_speed">Full wiki page on loss of control via speed</a></p><div class="poll-embed" data-attrs="{&quot;id&quot;:202662}" data-component-name="PollToDOM"></div><p></p><h2>Human non-alignment</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!iy_y!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!iy_y!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png 424w, https://substackcdn.com/image/fetch/$s_!iy_y!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png 848w, https://substackcdn.com/image/fetch/$s_!iy_y!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png 1272w, 
https://substackcdn.com/image/fetch/$s_!iy_y!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!iy_y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png" width="382" height="386.46016483516485" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1473,&quot;width&quot;:1456,&quot;resizeWidth&quot;:382,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!iy_y!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png 424w, https://substackcdn.com/image/fetch/$s_!iy_y!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png 848w, https://substackcdn.com/image/fetch/$s_!iy_y!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png 1272w, 
https://substackcdn.com/image/fetch/$s_!iy_y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdce113aa-fc9d-4240-8ee7-4b8c21775785_1582x1600.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">A utilitarian, a deep ecologist, and a Christian might agree on policy in the present world, but given arbitrary power their preferred futures might be a radical loss to the others. 
This isn't a problem of AI, but AI may cause us to face it much sooner than otherwise, before we have tools to navigate this situation.</figcaption></figure></div><p>Summary:</p><ol><li><p>People who broadly agree on good outcomes within the current world may, given much more power, choose outcomes that others would consider catastrophic</p></li><li><p>AI may empower some humans or human groups to bring about futures closer to what they would choose</p></li><li><p>From 1, that may be catastrophic according to the values of most other humans</p></li></ol><p>Selected counterarguments:</p><ul><li><p>Human values might be reasonably similar (possibly after extensive reflection)</p></li><li><p>This argument applies to anything that empowers humans. So it fails to show that AI is unusually dangerous among desirable technologies and efforts</p></li></ul><p>People who have favorably discussed this argument (specific quotes <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_competent_non-aligned_agents/start#discussion_of_this_argument_elsewhere">here</a>): Joe Carlsmith (<a href="https://joecarlsmith.com/2024/01/11/an-even-deeper-atheism">2024</a>), Katja Grace (<a href="https://www.alignmentforum.org/posts/LDRQ5Zfqwi8GjzPYG/counterarguments-to-the-basic-ai-x-risk-case">2022</a>), Scott Alexander (<a href="https://slatestarcodex.com/2018/09/25/the-tails-coming-apart-as-metaphor-for-life/">2018</a>).</p><p>See also: <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/is_ai_an_existential_threat_to_humanity/will_malign_ai_agents_control_the_future/argument_for_ai_x-risk_from_variance_in_human_values">Full wiki page on the human non-alignment argument</a></p><div class="poll-embed" data-attrs="{&quot;id&quot;:202663}" data-component-name="PollToDOM"></div><p></p><h2>Catastrophic tools</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!7SOx!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!7SOx!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png 424w, https://substackcdn.com/image/fetch/$s_!7SOx!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png 848w, https://substackcdn.com/image/fetch/$s_!7SOx!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png 1272w, https://substackcdn.com/image/fetch/$s_!7SOx!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!7SOx!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png" width="382" height="324.7" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/967d7e76-111e-45f2-b567-658d960daae9_1140x969.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:969,&quot;width&quot;:1140,&quot;resizeWidth&quot;:382,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!7SOx!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png 424w, https://substackcdn.com/image/fetch/$s_!7SOx!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png 848w, https://substackcdn.com/image/fetch/$s_!7SOx!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png 1272w, https://substackcdn.com/image/fetch/$s_!7SOx!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F967d7e76-111e-45f2-b567-658d960daae9_1140x969.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">The BADGER nuclear explosion, April 18, 1953 at the Nevada Test Site. Leo Szilard realized nuclear chain reactions might be possible in 1933, five years before nuclear fission was discovered in 1938. A large surge of intelligent effort might uncover more potentially world-ending technologies in quick succession. 
(Image: <a href="https://commons.wikimedia.org/wiki/File:Operation_Upshot-Knothole_-_Badger_001.jpg">National Nuclear Security Administration</a>)</figcaption></figure></div><p>Summary:</p><ol><li><p>There appear to be non-AI technologies that would pose a risk to humanity if developed</p></li><li><p>AI will markedly increase the speed of development of harmful non-AI technologies</p></li><li><p>AI will markedly increase the breadth of access to harmful non-AI technologies</p></li><li><p>Therefore AI development poses an existential risk to humanity</p></li></ol><p>Selected counterarguments:</p><ul><li><p>It is not clear that developing a potentially catastrophic technology makes its deployment highly likely</p></li><li><p>New technologies that are sufficiently catastrophic to pose an extinction risk may not be feasible soon, even with relatively advanced AI</p></li></ul><p>People who have favorably discussed this argument (specific quotes <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/is_ai_an_existential_threat_to_humanity/will_malign_ai_agents_control_the_future/argument_for_ai_x-risk_from_catastrophic_tools#discussions_of_this_argument_elsewhere">here</a>): Dario Amodei (<a href="https://www.judiciary.senate.gov/imo/media/doc/2023-07-26_-_testimony_-_amodei.pdf">2023</a>), Holden Karnofsky (<a href="https://www.openphilanthropy.org/research/potential-risks-from-advanced-artificial-intelligence-the-philanthropic-opportunity/">2016</a>), Yoshua Bengio (<a href="https://assets.publishing.service.gov.uk/media/6655982fdc15efdddf1a842f/international_scientific_report_on_the_safety_of_advanced_ai_interim_report.pdf">2024</a>).</p><p>See also: <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/is_ai_an_existential_threat_to_humanity/will_malign_ai_agents_control_the_future/argument_for_ai_x-risk_from_catastrophic_tools">Full wiki page on the catastrophic tools argument</a></p><div class="poll-embed" data-attrs="{&quot;id&quot;:202665}" 
data-component-name="PollToDOM"></div><p></p><h2>Powerful black boxes</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!oOXb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!oOXb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png 424w, https://substackcdn.com/image/fetch/$s_!oOXb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png 848w, https://substackcdn.com/image/fetch/$s_!oOXb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png 1272w, https://substackcdn.com/image/fetch/$s_!oOXb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!oOXb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png" width="382" height="277.919921875" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:745,&quot;width&quot;:1024,&quot;resizeWidth&quot;:382,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!oOXb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png 424w, https://substackcdn.com/image/fetch/$s_!oOXb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png 848w, https://substackcdn.com/image/fetch/$s_!oOXb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png 1272w, https://substackcdn.com/image/fetch/$s_!oOXb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c24c7-7cc6-4733-ad7c-6eae43e99fdc_1024x745.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">A volunteer and a nurse in a Phase 1 clinical trial. We sometimes develop technology without fully understanding its mechanisms of action, e.g. in medicine, and so proceed cautiously. AI systems are arguably less well-understood and their consequences have higher stakes. 
(Image: <a href="https://commons.wikimedia.org/wiki/File:Clinical_trial_for_malaria_treatment_(49450846413).jpg">NIH Image gallery</a>)</figcaption></figure></div><p>Summary:</p><ol><li><p>So far, humans have developed technology largely through understanding relevant mechanisms</p></li><li><p>AI systems developed in 2024 are created via repeatedly modifying random systems in the direction of desired behaviors, rather than being manually built, so the mechanisms the systems themselves ultimately use are not understood by human developers</p></li><li><p>Systems whose mechanisms are not understood are more likely to produce undesired consequences than well-understood systems</p></li><li><p>If such systems are powerful, then the scale of undesired consequences may be catastrophic</p></li></ol><p>Selected counterarguments:</p><ul><li><p>It is not clear that developing technology without understanding mechanisms is so rare. We have historically incorporated many biological products into technology, and improved them, without deep understanding of all involved mechanisms</p></li><li><p>Even if this makes AI more likely to be dangerous, that doesn't mean the harms are likely to be large enough to threaten humanity</p></li></ul><p>See also: <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_powerful_black_boxes">Full wiki page on the powerful black boxes argument</a></p><div class="poll-embed" data-attrs="{&quot;id&quot;:202666}" data-component-name="PollToDOM"></div><p></p><h2>Multi-agent dynamics</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!4_IY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source 
type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!4_IY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png 424w, https://substackcdn.com/image/fetch/$s_!4_IY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png 848w, https://substackcdn.com/image/fetch/$s_!4_IY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png 1272w, https://substackcdn.com/image/fetch/$s_!4_IY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!4_IY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png" width="382" height="319.24285714285713" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:980,&quot;resizeWidth&quot;:382,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!4_IY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png 424w, https://substackcdn.com/image/fetch/$s_!4_IY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png 848w, https://substackcdn.com/image/fetch/$s_!4_IY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png 1272w, https://substackcdn.com/image/fetch/$s_!4_IY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa49555c7-a1bb-42d6-a102-571a2b906e1c_980x819.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">Rabbits in Australia bred unchecked until the government stepped in, contrary to rabbit welfare (1938). Groups of entities often end up in scenarios that none of the members would individually choose, for instance because of the dynamics of competition. Prevalence of powerful AI may worsen this by heightening the intensity of competition. (Image: <a href="https://www.naa.gov.au/students-and-teachers/learning-resources/learning-resource-themes/environment-and-nature/conservation/rabbits-around-waterhole-during-myxomatosis-trial">National Archives of Australia</a>)</figcaption></figure></div><p>Summary:</p><ol><li><p>Competition can produce outcomes undesirable to all parties, through selection pressure favoring any behavior that survives well, or through high-stakes situations where well-meaning actors' best strategies are risky to all (as with nuclear weapons in the 20th century)</p></li><li><p>AI will increase the intensity of relevant competitions</p></li></ol><p>Selected counterarguments:</p><ul><li><p>It's not clear what effect AI will have on the large number of competitive situations in the world</p></li></ul><p>People who have favorably discussed this argument (specific quotes <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_destructive_multi-agent_dynamics">here</a>): Robin Hanson (<a href="https://mason.gmu.edu/~rhanson/aigrow.pdf">2001</a>)</p><p>See also: <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_destructive_multi-agent_dynamics">Full wiki page on 
the multi-agent dynamics argument</a></p><div class="poll-embed" data-attrs="{&quot;id&quot;:202667}" data-component-name="PollToDOM"></div><p></p><h2>Large impacts</h2><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!KQgb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KQgb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png 424w, https://substackcdn.com/image/fetch/$s_!KQgb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png 848w, https://substackcdn.com/image/fetch/$s_!KQgb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png 1272w, https://substackcdn.com/image/fetch/$s_!KQgb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!KQgb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png" width="382" height="238.22527472527472" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:908,&quot;width&quot;:1456,&quot;resizeWidth&quot;:382,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!KQgb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png 424w, https://substackcdn.com/image/fetch/$s_!KQgb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png 848w, https://substackcdn.com/image/fetch/$s_!KQgb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png 1272w, https://substackcdn.com/image/fetch/$s_!KQgb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67a4794c-0636-499d-a1d8-f2ef7944a6c2_1600x998.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">Replicas of <em>Ni&#241;a, Pinta</em> and<em> Santa Mar&#237;a</em> sail in 1893, mirroring Columbus' original transit 400 years earlier. Events with large consequences on many aspects of life are arguably more likely to have catastrophic consequences. (Image: <a href="https://commons.wikimedia.org/wiki/File:1893_Nina_Pinta_Santa_Maria_replicas.jpg">E. 
Benjamin Andrews</a>)</figcaption></figure></div><p>Summary:</p><ol><li><p>AI development will have very large impacts, relative to the scale of human society</p></li><li><p>Large impacts generally raise the chance of large risks</p></li></ol><p>Selected counterarguments:</p><ul><li><p>That AI will have large impacts is a vague claim, so it is hard to tell if it is relevantly true. For instance, 'AI' is a large bundle of technologies, so it might be expected to have large impacts. Many other large bundles of things will have 'large' impacts, for instance the worldwide continued production of electricity, relative to its ceasing. However, we do not consider electricity producers to pose an existential risk for this reason</p></li><li><p>Minor changes frequently have large impacts on the world (e.g. the butterfly effect). By this reasoning, perhaps we should never leave the house</p></li></ul><p>People who have favorably discussed this argument (specific quotes<a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_large_impacts#discussion_elsewhere"> here</a>): Richard Ngo (<a href="https://www.thinkingcomplete.com/2019/01/disentangling-arguments-for-importance.html">2019</a>)&nbsp;</p><p>See also: <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/argument_for_ai_x-risk_from_large_impacts">Full wiki page on the large impacts argument</a></p><div class="poll-embed" data-attrs="{&quot;id&quot;:202668}" data-component-name="PollToDOM"></div><p></p><h2>Expert opinion</h2><p>Summary:</p><ul><li><p>The people best placed to judge the extent of existential risk from AI are AI researchers, forecasting experts, experts on AI risk, relevant social scientists, and some others</p></li><li><p>Median members of these groups frequently put substantial credence (e.g. 
<a href="https://static1.squarespace.com/static/635693acf15a3e2a14a56a4a/t/64f0a7838ccbf43b6b5ee40c/1693493128111/XPT.pdf">0.4%</a> to <a href="https://aiimpacts.org/wp-content/uploads/2023/04/Thousands_of_AI_authors_on_the_future_of_AI.pdf">5%</a>) on human extinction or similar disempowerment from AI</p></li></ul><p>Selected counterarguments:</p><ul><li><p>Most of these groups do not have demonstrated skill at forecasting, and to our knowledge none have demonstrated skill at forecasting speculative events more than 5 years into the future</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!AXfl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!AXfl!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png 424w, https://substackcdn.com/image/fetch/$s_!AXfl!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png 848w, https://substackcdn.com/image/fetch/$s_!AXfl!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png 1272w, https://substackcdn.com/image/fetch/$s_!AXfl!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!AXfl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png" width="1456" height="730" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:730,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!AXfl!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png 424w, https://substackcdn.com/image/fetch/$s_!AXfl!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png 848w, https://substackcdn.com/image/fetch/$s_!AXfl!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png 1272w, https://substackcdn.com/image/fetch/$s_!AXfl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8ae3e6-1adb-439d-a919-5b7674812649_1600x802.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft 
icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Some evidence for this argument comes from our<a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2023_expert_survey_on_progress_in_ai"> 2023 Expert Survey on Progress in AI</a>. This graph shows 800 randomly selected responses on how good or bad the long-run impacts of 'high level machine intelligence' are expected to be for the future of humanity. Each vertical bar represents one participant's guess. The black section of each bar is the probability that participant put on 'extremely bad (e.g. 
human extinction)'.</figcaption></figure></div><div class="poll-embed" data-attrs="{&quot;id&quot;:202669}" data-component-name="PollToDOM"></div><p></p><div><hr></div><p>This is a snapshot of an <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_arguments_that_ai_poses_an_xrisk/start">AI Impacts wiki page</a>. For an up-to-date version, see there.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Each 'argument' here is intended to be a different line of reasoning; however, they are often not pointing to independent scenarios or using independent evidence. Some arguments attempt to reason about the same causal pathway to the same catastrophic scenarios, but relying on different concepts. Furthermore, 'line of reasoning' is a vague construct, and different people may consider different arguments here to be equivalent, for instance depending on what other assumptions they make or the relationship between their understanding of concepts.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Nathan Young puts 80% probability on the claim that, at the time of the quote, the individual would have endorsed the respective argument. 
They may endorse it whilst considering&nbsp;another argument stronger or more complete.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Superintelligence, Chapter 8</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>Human Compatible: Artificial Intelligence and the Problem of Control</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[Paper Summary: Princes and Merchants: European City Growth Before the Industrial Revolution]]></title><description><![CDATA[Freer societies have faster economic growth.]]></description><link>https://blog.aiimpacts.org/p/paper-summary-princes-and-merchants</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/paper-summary-princes-and-merchants</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Mon, 15 Jul 2024 21:25:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>by J. Bradford de Long and Andrei Shleifer. (1993)</p><p><a href="https://scholar.harvard.edu/files/shleifer/files/princes_merchants.pdf">https://scholar.harvard.edu/files/shleifer/files/princes_merchants.pdf</a>.</p><p><strong>Summary: </strong>Freer societies have faster economic growth.&nbsp;</p><div><hr></div><blockquote><p>One of the oldest themes in economics is the incompatibility of despotism and development. 
Economies in which security of property is lacking &#8211; because of either the possibility of arrest, ruin, or execution at the command of the ruling prince or the possibility of ruinous taxation &#8211; should experience relative stagnation. By contrast, economies in which property is secure &#8211; either because of strong constitutional restrictions on the prince or because the ruling elite is made up of merchants rather than princes &#8211; should prosper and grow.</p></blockquote><p>Much of the work investigating the relationship between despotism and economic growth focuses on modern economies. The quantity of data available for modern economies is much greater than for earlier economies.&nbsp;</p><p>However, there are some limitations to focusing only on modern economies. Long time series are often impossible to compare, because most data is only available for maybe a century. The development trajectories of different countries, both economically and politically, are not independent from each other, so there are fewer independent data points than it might initially seem.&nbsp;</p><p>Modern economies might also have different development patterns which cannot be easily replicated elsewhere. Having an abundance of certain natural resources, like oil, can cause a modern economy to grow regardless of its institutions. This often isn&#8217;t a useful observation: &#8216;Have large hydrocarbon reserves&#8217; is not actionable advice. The development strategies useful for catching up to the technological frontier might be different from the development strategies useful for societies at the technological frontier.</p><p>Looking at earlier economies can allow us to avoid some of these problems. The time series can be much longer. The economic and political developments of different countries are less likely to be similar to each other over the entire time frame. 
Pre-industrial economies were all centered around food production, so there is less opportunity to build an economy based on resource extraction. The technological differences between countries (at least within Europe) were smaller, so catch-up growth is less relevant. Pre-industrial economies provide a set of examples with different limitations than modern economies.</p><p>De Long &amp; Shleifer compare different regions of Europe between 1050 and 1800. The growth of cities with a population of at least 30,000 acts as a proxy for economic growth. They divide the time frame into 6 periods, they divide central and western Europe into 9 regions, and they categorize governments into 8 types. All of these categories are listed in Table 1.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!scC9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!scC9!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png 424w, https://substackcdn.com/image/fetch/$s_!scC9!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png 848w, https://substackcdn.com/image/fetch/$s_!scC9!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png 1272w, 
https://substackcdn.com/image/fetch/$s_!scC9!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!scC9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png" width="1040" height="489" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:489,&quot;width&quot;:1040,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:59608,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!scC9!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png 424w, https://substackcdn.com/image/fetch/$s_!scC9!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png 848w, https://substackcdn.com/image/fetch/$s_!scC9!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png 1272w, 
https://substackcdn.com/image/fetch/$s_!scC9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F847eb79c-cf2e-4733-a9db-bd6c7e7a8d0e_1040x489.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><strong>Table 1: </strong>Categories used by de Long &amp; Shleifer. 
The government types are ordered from most free to least free; the first 4 are considered &#8216;free&#8217; and the last 4 are considered &#8216;unfree.&#8217;</figcaption></figure></div><p>The main challenge with focusing on pre-industrial economies is that high quality data is much less available.&nbsp; Urban population is an imperfect proxy for economic development. There are several datasets that estimate the population of European cities over this time period, and they do not always agree with each other.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> There is also some arbitrariness in deciding which system of government was dominant in a region for more than a century. However, the effect size is large enough to be robust to these uncertainties.</p><p>This paper was written in 1993 and has over 1,000 citations. I have not looked at how more recent literature has responded to or developed since it was published. It seems to be a classic in its subfield.</p><p>There is a strong correlation between how free a society is and its urban growth (Figure 1).&nbsp;</p><blockquote><p>We find that, on average, for each century that such a region is free of government by an absolute prince, its total population living in cities of 30,000 or more inhabitants grew by 120,000, relative to a century of absolutist rule. This difference is larger than the average growth rate of urban populations in European regions between 1000 and 1800. 
In a purely statistical sense, therefore, the association between absolutism and slow city growth can more than account for why some western European regions had relatively low rates of urbanization in 1800, while others had flourishing cities and abundant commerce.</p></blockquote><p>Controlling for the region, the time period,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> and whether the city is an imperial capital makes the association even stronger.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!tDjA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!tDjA!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png 424w, https://substackcdn.com/image/fetch/$s_!tDjA!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png 848w, https://substackcdn.com/image/fetch/$s_!tDjA!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png 1272w, https://substackcdn.com/image/fetch/$s_!tDjA!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!tDjA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png" width="616" height="485.04513064133016" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:663,&quot;width&quot;:842,&quot;resizeWidth&quot;:616,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!tDjA!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png 424w, https://substackcdn.com/image/fetch/$s_!tDjA!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png 848w, https://substackcdn.com/image/fetch/$s_!tDjA!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png 1272w, https://substackcdn.com/image/fetch/$s_!tDjA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc0bee09-6f6a-4f54-82bd-9f55b279f158_842x663.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft 
icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><strong>Figure 1: </strong>Partial scatter of change in the number of large cities against absolutist regimes. Note that this is controlling for region and time period.</figcaption></figure></div><p>Several of the particular examples make it seem likely that this is causal, rather than merely a correlation.</p><p>In 1050, southern and northern Italy were similarly urbanized. Over the course of the next century, their political systems diverged. Southern Italy was consolidated under an unusually centralized Norman kingdom. 
The cities of northern Italy became increasingly independent from the Holy Roman Empire through the Investiture Controversy and Lombard League.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> These cities outpaced the growth of the cities of southern Italy over the next few centuries. Around 1500, the kingdoms of France and Spain reduced the independence of most of the Italian city-republics in the Italian Wars, and the urbanization of northern Italy slowed.</p><p>Prior to about 1500, the Low Countries were ruled by the Duke of Burgundy under a weak feudal government with checks on its power. These regions were then transferred to the Habsburgs, an increasingly bureaucratic absolute monarchy. The northern part of the Low Countries rebelled and formed the Dutch Republic. In 1500, the urbanization of the southern (Belgium) and northern (Netherlands) parts of the Low Countries was similar. They diverged dramatically afterwards. Urbanization in the Netherlands increased and it became a global economic power, while Belgium stagnated.</p><p>For most of the time between 1050 and 1800, England was an unusually centralized and bureaucratic kingdom. Around 1650, the English Civil War and Glorious Revolution established the supremacy of Parliament on the most important issues. England had been economically underdeveloped prior to 1650, but afterwards saw rapid economic growth culminating in the Industrial Revolution.</p><p>While the existence of the correlation is not surprising, it is surprising how clear the effect is. Modern economies do not show as strong a correlation between freedom and economic growth. With longer time frames and with fewer opportunities for technological catch-up or economies based on resource extraction, the long-term structural differences are more apparent. 
Medieval and early modern free societies had more economic development than their less free counterparts.</p><blockquote><p>From the perspective of the people alive at the time, or of the long-term growth of the economy, princely success is economic failure. For the people of southern Italy, the creation of the d&#8217;Hauteville <em>regno</em> was no blessing; for the people of Belgium, their incorporation into the Habsburg Empire was no benefit; for the people of Iberia, the marriage of Ferdinand and Isabella was no cause for rejoicing. The rise of an absolutist government and the establishment of princely authority are, from a perspective that values economic growth, events to be mourned and not celebrated.</p></blockquote><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>The largest errors seem to be about a factor of 3, for a few of the largest Muslim cities in Europe in 1050.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Some regions might systematically have faster city growth, independent of the institutions. For example, Habsburg Spain had access to tremendous wealth from the New World, while Habsburg Austria did not. Some time periods also had systematically faster city growth. 
For example, 1330-1500 had the Black Death, while 1500-1650 had an influx of wealth from the New World.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>There is a lot of history here that I am barely mentioning. The other specific examples&#8217; histories are similarly abbreviated.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Paper Summary: The Effects of Communicating Uncertainty on Public Trust in Facts and Numbers]]></title><description><![CDATA[Numerically expressing uncertainty when talking to the public is fine. It causes people to be less confident in the number itself (as it should), but does not cause people to lose trust in the source of that number.]]></description><link>https://blog.aiimpacts.org/p/paper-summary-the-effects-of-communicating</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/paper-summary-the-effects-of-communicating</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Tue, 09 Jul 2024 16:44:56 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!DGIX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>by Anne Marthe van der Bles, Sander van der Linden, Alexandra L. J. Freeman, and David J. Spiegelhalter. (2020)</p><p><em><a href="https://www.pnas.org/doi/pdf/10.1073/pnas.1913678117">https://www.pnas.org/doi/pdf/10.1073/pnas.1913678117</a>.</em></p><p><strong>Summary: </strong>Numerically expressing uncertainty when talking to the public is fine. 
It causes people to be less confident in the number itself (as it should), but does not cause people to lose trust in the source of that number.</p><div><hr></div><blockquote><p>Uncertainty is inherent to our knowledge about the state of the world yet often not communicated alongside scientific facts and numbers. In the &#8220;posttruth&#8221; era where facts are increasingly contested, a common assumption is that communicating uncertainty will reduce public trust. However, a lack of systematic research makes it difficult to evaluate such claims.</p></blockquote><p>Within many specialized communities, there are norms which encourage people to state numerical uncertainty when reporting a number. This is not often done when speaking to the public. The public might not understand what the uncertainty means, or they might treat it as an admission of failure. Journalistic norms typically discourage communicating the uncertainty.</p><p>But are these concerns actually justified? This can be checked empirically. Just because a potential bias is conceivable does not imply that it is a significant problem for many people. This paper does the work of actually checking whether these concerns are valid.</p><p>Van der Bles et al. ran five surveys in the UK with a total n = 5,780. A brief description of their methods can be found in the appendix below.</p><p>Respondents&#8217; trust in the numbers varied with political ideology, but how they reacted to the uncertainty did not.</p><p>People were told the number either without mentioning uncertainty (as a control), with a numerical range, or with a verbal statement that uncertainty exists for these numbers. The study did not investigate stating p-values for beliefs. Exact statements used in the survey can be seen in Table 1, in the appendix.</p><p>The best summary of their data is in their Figure 5, which presents results from surveys 1-4. 
The fifth survey had smaller effect sizes, so none of the shifts in trust were significant.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DGIX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DGIX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png 424w, https://substackcdn.com/image/fetch/$s_!DGIX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png 848w, https://substackcdn.com/image/fetch/$s_!DGIX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png 1272w, https://substackcdn.com/image/fetch/$s_!DGIX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DGIX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png" width="502" height="379.1145833333333" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:725,&quot;width&quot;:960,&quot;resizeWidth&quot;:502,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!DGIX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png 424w, https://substackcdn.com/image/fetch/$s_!DGIX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png 848w, https://substackcdn.com/image/fetch/$s_!DGIX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png 1272w, https://substackcdn.com/image/fetch/$s_!DGIX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcb05fa5-ea43-4108-b663-baaec127c1a2_960x725.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DupO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DupO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png 424w, https://substackcdn.com/image/fetch/$s_!DupO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png 848w, 
https://substackcdn.com/image/fetch/$s_!DupO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png 1272w, https://substackcdn.com/image/fetch/$s_!DupO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DupO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png" width="504" height="367.3306451612903" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:723,&quot;width&quot;:992,&quot;resizeWidth&quot;:504,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!DupO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png 424w, https://substackcdn.com/image/fetch/$s_!DupO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png 848w, 
https://substackcdn.com/image/fetch/$s_!DupO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png 1272w, https://substackcdn.com/image/fetch/$s_!DupO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d4924b9-6e7a-427f-970e-23ecd573fa82_992x723.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!r4vB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!r4vB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png 424w, https://substackcdn.com/image/fetch/$s_!r4vB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png 848w, https://substackcdn.com/image/fetch/$s_!r4vB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png 1272w, https://substackcdn.com/image/fetch/$s_!r4vB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!r4vB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png" width="492" height="381.5809334657398" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:781,&quot;width&quot;:1007,&quot;resizeWidth&quot;:492,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!r4vB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png 424w, https://substackcdn.com/image/fetch/$s_!r4vB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png 848w, https://substackcdn.com/image/fetch/$s_!r4vB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png 1272w, https://substackcdn.com/image/fetch/$s_!r4vB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7eb8b520-515b-4f39-a6a9-83e66c17f029_1007x781.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Expressing uncertainty made it more likely that people perceived uncertainty in the number (A). This is good. When the numbers are uncertain, science communicators should want people to believe that they are uncertain. Interestingly, verbally reminding people of uncertainty resulted in higher perceived uncertainty than stating the numerical range, which could mean that people overestimate the uncertainty when verbally reminded of it.</p><p>The surveys distinguished between trust in the number itself (B) and trust in the source (C). Numerically expressing uncertainty resulted in a small decrease in trust in that number. Verbally expressing uncertainty resulted in a larger decrease in trust in that number. Numerically expressing uncertainty resulted in no significant change in trust in the source. Verbally expressing uncertainty resulted in a small decrease in trust in the source. 
</p><p>The consequences of expressing numerical uncertainty are what I would have hoped: people trust the number a bit less than if they hadn&#8217;t thought about uncertainty at all, but don&#8217;t think that this reflects badly on the source of the information.</p><blockquote><p>Centuries of human thinking about uncertainty among many leaders, journalists, scientists, and policymakers boil down to a simple and powerful intuition: &#8220;No one likes uncertainty.&#8221; It is therefore often assumed that communicating uncertainty transparently will decrease public trust in science. In this program of research, we set out to investigate whether such claims have any empirical basis.</p></blockquote><p>The answer is mostly no. Good epistemic practice is not bad journalistic practice. When you give people numerical estimates of uncertainty of a number, they respond the way they should. The perceived confidence in the number itself goes down, while the trust in the source does not. Verbally reminding people of uncertainty seems like a worse practice: it causes people to distrust the source of information and seems to cause them to overestimate the uncertainty in the number. 
Expressing no uncertainty seems to make people overconfident in the number reported.</p><p>It is better to use good epistemics when talking to the public than it is to try to correct for their bad epistemics with compromised epistemics of your own.</p><blockquote><p>The high degree of consistency in our results, across topics, magnitudes of uncertainty, and communication formats suggest that people &#8220;can handle the truth.&#8221;</p></blockquote><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://blog.aiimpacts.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://blog.aiimpacts.org/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><h3>Appendix: Survey Methods</h3><p>Of the five surveys involved in this paper, the first three each had about 1,000 participants in the UK, recruited using the platform Prolific, who were paid &#163;1.20 to complete a 2-minute survey. The fourth survey was a preregistered replication with 1,050 adults in the UK recruited using the platform Qualtrics Panel. The fifth survey was a field experiment done with <em>BBC News</em>. When <em>BBC News</em> reported new labor market statistics on October 15, 2019, they ran three different versions of the article, and included a link to the survey in the article. There were 1,700 people who completed this survey.</p><p>Each survey presented readers with a measured number of some statistic: the number of unemployed people in the UK, the net number of migrants between the EU and UK, the amount the Earth&#8217;s average global temperature increased between 1880 &amp; 2012, and the number of tigers in India. 
Some of these are more partisan issues than others in the UK, and the error bars are different sizes relative to the size of the number.&nbsp;</p><p>Table 1 shows some of the exact statements used in the 3rd and 4th surveys.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!KF-D!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KF-D!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png 424w, https://substackcdn.com/image/fetch/$s_!KF-D!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png 848w, https://substackcdn.com/image/fetch/$s_!KF-D!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png 1272w, https://substackcdn.com/image/fetch/$s_!KF-D!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!KF-D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png" width="1325" height="673" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:673,&quot;width&quot;:1325,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!KF-D!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png 424w, https://substackcdn.com/image/fetch/$s_!KF-D!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png 848w, https://substackcdn.com/image/fetch/$s_!KF-D!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png 1272w, https://substackcdn.com/image/fetch/$s_!KF-D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb10d4f89-9274-4917-8f75-1691bbaf768e_1325x673.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The participants were somewhat more educated and more liberal than the general public in all five of the surveys, and the participants in the first three surveys were somewhat younger and more often female.</p>]]></content:encoded></item><item><title><![CDATA[Advice for Activists from the History of Environmentalism]]></title><description><![CDATA[Other movements should try to avoid becoming as partisan as the environmental movement. Partisanship did not make environmentalism more popular; it made legislation more difficult to pass, and it resulted in fluctuating executive action. 
Looking at the history of environmentalism can give insight into what to avoid in order to stay bipartisan.]]></description><link>https://blog.aiimpacts.org/p/advice-for-activists-from-the-history</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/advice-for-activists-from-the-history</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Thu, 16 May 2024 18:35:08 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/a919142a-0609-4fe0-80ed-3d0cfefcec95_1024x1024.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>This is the fourth in a sequence of posts taken from my recent report: <a href="http://aiimpacts.org/wp-content/uploads/2023/04/Why-Did-Environmentalism-Become-Partisan-1.pdf">Why Did Environmentalism Become Partisan?</a></em></p><p><em>This post has more of my personal opinions than previous posts or the report itself.</em></p><div><hr></div><p>Other movements should try to avoid becoming as partisan as the environmental movement. Partisanship did not make environmentalism more popular; it made legislation more difficult to pass, and it resulted in fluctuating executive action. Looking at the history of environmentalism can give insight into what to avoid in order to stay bipartisan.</p><p>Partisanship was not inevitable. It occurred as the result of choices and alliances made by individual decision makers. If they had made different choices, environmentalism could have ended up being a bipartisan issue, like it was in the 1980s and is in some countries in Europe and democratic East Asia.</p><p>Environmentalists were not the only people making significant decisions here. Fossil fuel companies and conservative think tanks also had agency in the debate &#8211; and their choices were more blameworthy than the choices of environmentalists. Politicians choose who they do and do not want to ally with. 
My focus is on the environmental movement itself, because its own choices are the closest analogue to what other activist groups are able to control.</p><p>I am more familiar with the history of the environmental movement than with most other social movements. The environmental movement is particularly interesting because it involves an important global issue that used to be broadly popular, but has since become very partisan and less effective at enacting policy in the United States. It nevertheless can be risky to over-update on a single case study. Much of the advice given here has support in the broader social movements literature, but the particulars are based on the history of one movement.</p><p>With those caveats aside, let&#8217;s look at what we can learn.</p><div><hr></div><p>Here is a list of advice I have gleaned from this history:</p><ol><li><p><strong>Make political alliances with individuals and institutions in both political parties.</strong><br><br>This is the most important advice.<br><br>Allying with the Democratic Party might have seemed like a natural choice at the time. Climate scientists might have already leaned left, and so found allying with Democrats to be more natural &#8211; although the evidence for this is weak. Al Gore was committed to their cause, and was rapidly building political influence: from Representative to Senator to Vice President, and almost to President. <br><br>The mistake was not simultaneously pursuing alliances with rising Republicans as well. At the time, it would not have been too difficult to find some who were interested. <br><br>Building relationships with both parties involves recruiting or persuading staffers for both Democratic and Republican congressmen and analysts for both conservative and liberal think tanks. 
Personal relationships with individuals and institutions often matter more than the implications of a fully consistent ideology.<br></p></li><li><p><strong>Don&#8217;t give up on one side once partisanship starts to be established.</strong><br><br>I wouldn&#8217;t be surprised if some environmentalists in the late 1990s or 2000s thought that the issue was already partisan, so it didn&#8217;t matter that they were only working with one side. They were wrong. Partisanship could and did continue to get worse. Environmentalism is now one of the most partisan issues in the country, if not the most partisan.<br><br>In 1995, after Newt Gingrich had won control of the House of Representatives while opposing the BTU tax, there was still only one conservative think tank that regularly promoted climate skepticism. Environmentalists might have been able to gain influence at other conservative think tanks to weaken the reframing efforts of fossil fuel companies.<br><br>In 2006, Al Gore&#8217;s documentary <em>An Inconvenient Truth</em> did not change the opinions of the public overall, but did encourage a new generation of activists. He might have been able to reduce the partisan effect of the documentary by collaborating with a prominent Republican who supported climate policies, like Schwarzenegger or McCain. <br><br>Ongoing decisions by environmentalists and their allies continued to reinforce the partisan divide.<br></p></li><li><p><strong>Proposing flawed legislation, and losing the resulting legislative battle, seems quite bad.</strong><br><br>There were two key legislative defeats as environmentalism started to become partisan: the BTU tax in 1993 and the Kyoto Protocol in 1997. <br><br>In both cases, the legislation seems poorly designed. The BTU tax focused on energy, not greenhouse gases, with exemptions for favored industries. The Kyoto Protocol had already been rejected by the entire Senate. 
<br><br>Unpopular legislation proposed by environmentalists and their allies made it easier for other politicians to rally against environmentalism. <br><br>Drafting good legislation is important both to get what you actually want enacted and to avoid offering opponents easy openings for attack.<br></p></li><li><p><strong>Be cautious and intentional about mission creep.</strong><br><br>Mission creep is the gradual expansion of an institution&#8217;s or a movement&#8217;s goals beyond their original intention. For an advocacy group focusing on a complex issue, some mission creep is inevitable: as your understanding of the problems grows, there should be some changes to the goals you are pursuing to address these problems.<br><br>Mission creep can also involve expanding your goals to include the goals of your current political allies, even if they are not directly related to the original intention. This seems bad. Environmental organizations today promote liberal positions on many other policy issues and reliably endorse one political party.<br><br>If the organizations in a movement endorse controversial positions aligned with one party, it should not be surprising if many people associate them with that party. Allowing mission creep makes it harder to build bipartisan coalitions. 
There are more people who agree with you on environmental issues than there are people who agree with you on environmental issues and abortion and the Israeli-Palestinian conflict and &#8230; .<br><br>Your movement should try to avoid having public opinions on most issues and to focus only on the issues central to your original intention.<br></p></li><li><p><strong>Focusing on local issues makes it easier to form idiosyncratic partnerships that cut across party lines.</strong><br><br>In its first few decades, the modern environmental movement focused primarily on local concerns: air pollution in Los Angeles, the Cuyahoga River fire, the proposed Bodega Bay nuclear power plant, and proposed dams in the Grand Canyon. In the late 1980s, much of the attention of the environmental movement shifted towards climate change, an inherently global phenomenon. This does not reflect public opinion, which seems to be more concerned with local environmental issues than with climate change.<br><br>Local politics in the US is less partisan than national politics.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> It is often not obvious how the national parties would respond to specific local questions, so there are fewer elite cues to divide people into partisan camps. Different localities compete with each other for population and economic activity, and so respond to where people are choosing to live in addition to how people vote. For these reasons, local issues often involve idiosyncratic partnerships cutting across party lines.<br><br>The environmental movement&#8217;s shift from local issues to one international issue made it easier for it to become consistently tied to one political party.<br><br>There is some reason for caution here. You do not want a particular local partnership to turn into an alliance that defines your movement. 
This problem seems solvable by not becoming too committed to local partnerships and by managing mission creep well.<br></p></li><li><p><strong>Getting messaging right seems hard.</strong><br><br>Both underselling and overselling your arguments seem likely to have bad results. Either could undermine public trust in your expertise. <br><br>Explicitly stating numerical uncertainty to the public is fine, and does not cause people to trust you less.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> Telling policy makers both your politically plausible asks and your more ambitious hopes also seems fine.<br><br>Failing to distinguish between empirical and normative claims could be effective in the short term: if people accept the validity of the empirical claims, conflating them with policy proposals can make it easier to get these policies enacted. It seems counterproductive in the longer term: if people do not accept your policy goals, it can also make them more dismissive of your empirical claims.<br><br>I am of the opinion that you should use good epistemics when talking to the public or policy makers, rather than using bad epistemics to try to be more persuasive. Most subject matter experts are not also experts in public messaging, and so typically do not know how to use effective rhetoric and narrative-crafting. Being publicly revealed to have been dishonest seems like it damages trust much more than using good epistemics in a rhetorically suboptimal way. I would rather have a reputation as someone who trusts the public and policy makers to understand my key points than as someone who looks down on their ability to understand what I&#8217;m worried about.<br><br>It is unclear whether any one actor could have dramatically improved the messaging, or if that would have required an unrealistic amount of discipline within the movement. 
It was not hard for activists on either side to find climate scientists who were willing to confidently argue their position to the public.</p></li></ol><div><hr></div><p>To me, the AI safety movement feels sort of like environmentalism in the 1960s or climate change in the 1980s. The movement is still really young. Most of the public is still uncertain what to think about it.</p><p>Despite this uncertainty, a decent share of the public seems to support the goals of the AI safety movement. Polls indicate that many people are skeptical that AI will have a positive impact on society, and that some amount of government regulation is broadly popular.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a></p><p>This does not inherently imply that the AI safety movement will succeed if, for example, it proposes a ballot measure for the next election. The public is still more uncertain than supportive. The details of any proposal would still need to be developed and promoted. Various leaders and groups may respond in unpredictable ways. Public opinion might look very different after a major political push than it did before. But I do think that these polls indicate that there is latent public support that the AI safety community could develop in support of its policy goals.</p><p>When trying to build this latent public support, it is important to cast as wide a net as possible. Many different people might be interested in and willing to support the AI safety movement &#8211; including people who are culturally very different from the people who are currently working on AI safety. 
The movement should try to build relationships with as varied a group of people as possible.&nbsp;</p><p>A broad bipartisan movement would be more effective at enacting policy than a movement closely allied to one political party.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Amalie Jensen, William Marble, Kenneth Scheve, &amp; Matthew J. Slaughter. <em>City limits to partisan polarization in the American public. </em>Political Science Research and Methods <strong>9</strong>. (2021) p. 223&#8211;241. 
<a href="https://static1.squarespace.com/static/5b74a2ebfcf7fda680a56b29/t/63bdb31d5fbd7153248b5f47/1673376544024/JensenEtAl_PSRM_2021.pdf">https://static1.squarespace.com/static/5b74a2ebfcf7fda680a56b29/t/63bdb31d5fbd7153248b5f47/1673376544024/JensenEtAl_PSRM_2021.pdf</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Anne Marthe van der Bles, Sander van der Linden, Alexandra L. J. Freeman, &amp; David J. Spiegelhalter. <em>The effects of communicating uncertainty on public trust in facts and numbers. </em>Proceedings of the National Academy of Sciences <strong>117.14</strong>. (2020) p. 7672&#8211;7683. <a href="https://www.pnas.org/doi/pdf/10.1073/pnas.1913678117">https://www.pnas.org/doi/pdf/10.1073/pnas.1913678117</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p><em>Surveys of US public opinion on AI. </em>AI Impacts Wiki. (Accessed: May 8, 2024) <a href="https://wiki.aiimpacts.org/responses_to_ai/public_opinion_on_ai/surveys_of_public_opinion_on_ai/surveys_of_us_public_opinion_on_ai">https://wiki.aiimpacts.org/responses_to_ai/public_opinion_on_ai/surveys_of_public_opinion_on_ai/surveys_of_us_public_opinion_on_ai</a>.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Was Partisanship Good for the Environmental Movement?]]></title><description><![CDATA[Rising partisanship did not make environmentalism more popular or politically effective. 
Instead, it coincided with flat or falling overall public opinion, fewer major legislative achievements, and fluctuating executive actions.]]></description><link>https://blog.aiimpacts.org/p/was-partisanship-good-for-the-environmental</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/was-partisanship-good-for-the-environmental</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Wed, 15 May 2024 16:49:49 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c496430-f63c-4764-a07a-072047643f91_1200x742.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>This is the third in a sequence of posts taken from my recent report: <a href="http://aiimpacts.org/wp-content/uploads/2023/04/Why-Did-Environmentalism-Become-Partisan-1.pdf">Why Did Environmentalism Become Partisan?</a></em></p><h2>Summary</h2><p>Rising partisanship did not make environmentalism more popular or politically effective. Instead, it coincided with flat or falling overall public opinion, fewer major legislative achievements, and fluctuating executive actions.</p><h2>Public Opinion</h2><p>One hypothesis is that partisanship was useful, or even necessary, for an issue to become popular. Maybe journalists never would have covered the story if it did not involve an exciting partisan contest. The public then might never have realized that this is a thing they could care about.</p><p>The polling data do not support this hypothesis.</p><p>The clearest data come from McCright et al.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> Over 70% of both parties, and both ideologies, supported more government spending on the environment in 1990. Then, over the next 20 years, Republicans&#8217; support for environmental spending fell dramatically while Democratic support remained roughly constant. 
These polls show declining overall support as partisanship increased.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vTOW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce75127f-d952-4f8c-b0a7-70d28f8dc5ad_814x475.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!vTOW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce75127f-d952-4f8c-b0a7-70d28f8dc5ad_814x475.png" width="814" height="475" class="sizing-normal" alt="" fetchpriority="high"></picture></div></a></figure></div><p><em><strong>Figure 1: </strong>Percentages of Democrats and Republicans reporting that national spending on the environment is &#8220;Too Little,&#8221; 1974-2012. Reprinted from McCright et al. (2014).</em></p><p>The Gallup data is more ambiguous.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> It starts later, so it cannot show what was happening in the 1990s. 
For four different questions about global warming, Gallup shows roughly flat support from 2001-2021, although there have been fluctuations in the level of support.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!x2Fu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4175b951-3f48-4e51-a0ea-f637abd61fd6_720x391.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!x2Fu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4175b951-3f48-4e51-a0ea-f637abd61fd6_720x391.png" width="720" height="391" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><p><em><strong>Figure 2: </strong>Percentage of Americans who agree with four different statements about global warming. Reprinted from Gallup (2021).</em></p><p>Public opinion is not quite the same thing as public attention. Maybe partisanship increases how much people are talking about an issue, even if it has little impact on support as measured in polls. As a proxy for public attention, I will use the Google Books Ngram Viewer, which shows how frequently phrases appear in a corpus of English language books published each year.</p><p>For many environmental issues, there was a peak in the early 1990s. Public attention was growing prior to the issue becoming partisan and declining once the issue had become partisan. This might be because the environmental movement itself shifted focus away from these issues. &#8216;Pollution&#8217; shows a similar peak around 1990, but also a larger peak in the early 1970s. 
&#8216;Climate change&#8217; shows a somewhat different pattern: rapid growth before 1990, which then levels off until the mid-2000s, followed by resumed growth. It is possible to have increasing public attention while a topic is highly partisan, but for all of these environmental issues, public attention was flat or falling while partisanship was becoming established in the 1990s and early 2000s. Increasing partisanship does not seem to be a reliable way to attract public attention.&nbsp;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!bwTS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55c2ecc2-b5ab-4291-b206-2ee3cedd3389_1410x489.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!bwTS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55c2ecc2-b5ab-4291-b206-2ee3cedd3389_1410x489.png" width="1410" height="489" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><p><em><strong>Figure 3a: </strong>Frequency of the phrases &#8216;ozone,&#8217; &#8216;deforestation,&#8217; &#8216;endangered species,&#8217; and &#8216;acid rain&#8217; in books each year from 1945-2019. 
From <a href="https://books.google.com/ngrams/graph?content=acid+rain%2Cdeforestation%2Cendangered+species%2Cozone&amp;year_start=1945&amp;year_end=2019&amp;corpus=en-2019&amp;smoothing=0">Google Ngram Viewer</a>.</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!oekg!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d61f239-23bb-4522-9ef9-0701b0e7bb2e_1391x479.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!oekg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d61f239-23bb-4522-9ef9-0701b0e7bb2e_1391x479.png" width="1391" height="479" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><p><em><strong>Figure 3b: </strong>Frequency of the phrases &#8216;pollution,&#8217; &#8216;air pollution,&#8217; and &#8216;water pollution&#8217; in books each year from 1945-2019. 
From <a href="https://books.google.com/ngrams/graph?content=pollution%2Cair+pollution%2Cwater+pollution&amp;year_start=1945&amp;year_end=2019&amp;corpus=en-US-2019&amp;smoothing=0">Google Ngram Viewer</a>.</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!GDrq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7716e983-ac3b-4cdc-9f95-a34a272599e3_1391x490.png"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!GDrq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7716e983-ac3b-4cdc-9f95-a34a272599e3_1391x490.png" width="1391" height="490" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><p><em><strong>Figure 3c: </strong>Frequency of the phrases &#8216;climate change,&#8217; &#8216;global warming,&#8217; and &#8216;climatic change&#8217; in books each year from 1945-2019. From <a href="https://books.google.com/ngrams/graph?content=global+warming%2Cclimate+change%2Cclimatic+change&amp;year_start=1945&amp;year_end=2019&amp;corpus=en-2019&amp;smoothing=0">Google Ngram Viewer</a>.</em></p><p>Republicans&#8217; opposition to climate change does not seem to result from a lack of information. Two polls in swing states in 2011 indicated that Republicans and Democrats with less education, or who said they knew little about climate change, held similar views. 
As education and knowledge increased, Democrats became more concerned about climate change, while Republicans became less concerned.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> This result is consistent with the overall trend: as time goes on and people become more familiar with climate change, the partisan divide deepens without overall support increasing.</p><h2>Legislation</h2><p>Many of the goals of environmentalism cannot be achieved by public opinion alone: they require new legislation.</p><p>To determine whether partisanship has made it easier or harder to pass legislation, I investigated when major pieces of environmental legislation were passed.</p><p>What counts as &#8216;major&#8217; environmental legislation?</p><p>The Congressional Research Service produced a summary of the &#8220;major statutes administered by the EPA&#8221; in 2013<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a> and the Library of Congress has a research guide about &#8220;significant legislation governing environmental law and policy.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> Any law mentioned in either of these is included.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a> I categorize these laws as dealing with environmental impact statements, air pollution, water pollution, solid waste, toxic substances, or endangered species. All of these are domestic laws &#8211; treaties will be discussed below.</p><p>These laws often create the framework of U.S. environmental policy. Newer legislation might not need to create a new framework, and could instead amend the existing one. 
In addition, some of these laws superseded earlier environmental legislation. To account for both of these possibilities, I turned to Wikipedia. The sidebar of the Wikipedia article for each of these laws lists its &#8220;major amendments.&#8221; I include each of these as a piece of major environmental legislation, checking to make sure there is no double counting.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> I also read the History section to check for any major precursors. Each law has a well-developed article, and I trust these articles&#8217; judgment about what counts as a &#8220;major amendment&#8221; and which precursors merit mention more than a search and categorization I might do specifically for this report.</p><p>A graph of the number of pieces of major environmental legislation and amendments since 1945 is shown in Figure 4, binned into four-year intervals corresponding to presidential terms.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X15n!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c496430-f63c-4764-a07a-072047643f91_1200x742.png"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!X15n!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c496430-f63c-4764-a07a-072047643f91_1200x742.png" width="1200" height="742" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><p><em><strong>Figure 4: </strong>The number of major pieces of domestic environmental legislation, including amendments and precursors 
to existing legislation, since 1945. Each four-year bin corresponds to a presidential term.</em></p><p>There was clearly more environmental legislation passed in the 1960s-80s than in earlier or later decades. The modern environmental movement began in the 1960s and became legislatively successful within the decade. After the movement became partisan in the 1990s, it was far less able to pass major legislation.</p><p>Treaties show a similar, if sparser, pattern. During the 1970s and 1980s, the Senate ratified four international environmental treaties unanimously.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a> The U.N. Framework Convention on Climate Change was also ratified, using a division vote, in 1992.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a> Between 1989 and 2001, the United States signed four environmental treaties that the Senate refused to ratify, including the Kyoto Protocol.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a> More recently, international environmental agreements &#8211; like the Paris Climate Accords &#8211; have been structured so that they do not require Senate ratification.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a></p><p>Increasing partisanship has made it more difficult for the environmental movement to pass its legislative agenda.</p><h2>Executive Actions</h2><p>Environmental policy can also be enacted by the executive branch. 
When environmentalism is partisan, executive action fluctuates with broader political winds.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-12" href="#footnote-12" target="_self">12</a> Over the medium-to-long term, it is unlikely that one political party will consistently win elections. Having bipartisan support for an issue results in more stable and reliable executive policies.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-13" href="#footnote-13" target="_self">13</a></p><p>This can be seen most clearly in international agreements. The Clinton administration negotiated the Kyoto Protocol, the Bush administration declared that it would not implement it, the Obama administration negotiated the Paris Accords, the Trump administration withdrew from them, and the Biden administration rejoined them.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-14" href="#footnote-14" target="_self">14</a></p><p>Much of the implementation of environmental policy occurs in the executive branch, especially the EPA. While Republican administrations have not been effective at reducing the size or budget of the EPA,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-15" href="#footnote-15" target="_self">15</a> they have chosen leaders with closer ties to fossil fuel companies than to the environmental movement. Many of the details of how policy is enacted are small enough not to be widely reported, but some are. 
Examples include changing rules for air pollution<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-16" href="#footnote-16" target="_self">16</a> or removing &#8216;climate change&#8217; from the EPA&#8217;s website.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-17" href="#footnote-17" target="_self">17</a> These changes can be reversed by future Democratic administrations, but the result is that environmental protections are not consistently applied.</p><h2>Conclusion</h2><p>Environmentalism was a very successful movement from the 1960s-1980s. It convinced a majority of people in both parties of the importance of its concerns. It was effective at passing domestic legislation and negotiating international treaties. Presidents from both parties supported the movement.</p><p>The rise in partisanship starting in the 1990s was bad for environmentalism. Some presidential administrations are now hostile to the movement. Major legislation is still sometimes passed, but much less frequently than before. Overall public support did not increase as environmentalism became partisan.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Aaron M. McCright, Chenyang Xiao, &amp; Riley E. Dunlap. <em>Political polarization on support for government spending on environmental protection in the USA, 1974-2012. </em>Social Science Research <strong>48</strong>. (2014) p. 251-260. <a href="https://www.sciencedirect.com/science/article/abs/pii/S0049089X1400132X">https://www.sciencedirect.com/science/article/abs/pii/S0049089X1400132X</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Lydia Saad. <em>Global Warming Attitudes Frozen Since 2016. </em>Gallup. (2021) <a href="https://news.gallup.com/poll/343025/global-warming-attitudes-frozen-2016.aspx">https://news.gallup.com/poll/343025/global-warming-attitudes-frozen-2016.aspx</a>.</p><p>Note that there are several similar questions, all of which show a small or zero partisan gap when the data starts, a gap that grows dramatically over time.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Lawrence C. Hamilton. <em>Education, politics and opinions about climate change: Evidence for interaction effects. </em>Climatic Change <strong>104</strong>. 
(2011) p.231&#8211;242. <a href="https://scholars.unh.edu/cgi/viewcontent.cgi?article=1388&amp;context=soc_facpub">https://scholars.unh.edu/cgi/viewcontent.cgi?article=1388&amp;context=soc_facpub</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p><em>Environmental Laws: Summaries of Major Statutes Administered by the Environmental Protection Agency</em>. Congressional Research Service. (2013) <a href="https://crsreports.congress.gov/product/pdf/RL/RL30798">https://crsreports.congress.gov/product/pdf/RL/RL30798</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p><em>Environmental Law: A Beginner's Guide.</em> Library of Congress: Research Guides. (Accessed April 29, 2024) <a href="https://guides.loc.gov/environmental-law/federal-laws">https://guides.loc.gov/environmental-law/federal-laws</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>&nbsp;Here is the list of laws mentioned, arranged by category:</p><ul><li><p>Impact Statements:</p><ul><li><p>National Environmental Policy Act</p></li></ul></li><li><p>Air Pollution:</p><ul><li><p>Clean Air Act</p></li></ul></li><li><p>Water Pollution:</p><ul><li><p>Clean Water Act</p></li><li><p>Marine Protection, Research, and Sanctuaries Act</p></li><li><p>Safe Drinking Water Act</p></li><li><p>Oil Pollution Act</p></li></ul></li><li><p>Solid Waste:</p><ul><li><p>Solid Waste Disposal Act</p></li><li><p>Resource Conservation and Recovery Act</p></li><li><p>Comprehensive Environmental Response, Compensation, and Liability 
Act</p></li></ul></li><li><p>Toxic Substances:</p><ul><li><p>Toxic Substances Control Act</p></li><li><p>Federal Insecticide, Fungicide, and Rodenticide Act</p></li><li><p>Pollution Prevention Act</p></li><li><p>Emergency Planning and Community Right-to-Know Act</p></li></ul></li><li><p>Endangered Species:</p><ul><li><p>Endangered Species Act</p></li></ul></li></ul></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>For example, the Resource Conservation and Recovery Act also appears as a major amendment to the Solid Waste Disposal Act.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>The treaties are:</p><ul><li><p>Convention on International Trade in Endangered Species of Wild Fauna and Flora</p></li><li><p>International Convention for the Prevention of Pollution from Ships, 1973 as modified by the Protocol of 1978</p></li><li><p>Convention on Long-Range Transboundary Air Pollution</p></li><li><p>Montreal Protocol on Substances That Deplete the Ozone Layer</p></li></ul></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>The UNFCCC was ratified using a division vote, in which Senators stand for &#8220;yea&#8221; and &#8220;nay&#8221; and the presiding officer counts the number of Senators standing for each. The result of the vote is not recorded other than whether it passed. Treaties require 2/3 support of the Senate to be ratified, so it had to have had significant bipartisan support. 
Typically, division votes and voice votes are used when the result of the vote is not in doubt beforehand.</p><p><em>About Voting. </em>U.S. Senate. (Accessed March 22, 2024) <a href="https://www.senate.gov/about/powers-procedures/voting.htm">https://www.senate.gov/about/powers-procedures/voting.htm</a>.</p><p><em>United Nations Framework Convention on Climate Change. </em>Senate Consideration of Treaty Document 102-38. (1992) <a href="https://www.congress.gov/treaty-document/102nd-congress/38">https://www.congress.gov/treaty-document/102nd-congress/38</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>The treaties are:</p><ul><li><p>Basel Convention on the Control of Transboundary Movements of Hazardous Wastes and Their Disposal</p></li><li><p>Convention on Biological Diversity</p></li><li><p>Kyoto Protocol to the UNFCCC</p></li><li><p>Stockholm Convention on Persistent Organic Pollutants</p></li></ul></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p>Ed King. <em>Paris agreement &#8216;does not need Senate approval&#8217; say officials. </em>Climate Home News. (2015) <a href="https://www.climatechangenews.com/2015/12/15/paris-agreement-does-not-need-senate-approval-say-officials/">https://www.climatechangenews.com/2015/12/15/paris-agreement-does-not-need-senate-approval-say-officials/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-12" href="#footnote-anchor-12" class="footnote-number" contenteditable="false" target="_self">12</a><div class="footnote-content"><p>Robert A. Wampler. <em>The U.S. and Climate Change: Washington&#8217;s See-Saw on Global Leadership. 
</em>George Washington University: National Security Archive. (2018) <a href="https://nsarchive.gwu.edu/briefing-book/environmental-diplomacy/2018-09-24/us-climate-change-washingtons-see-saw-global-leadership">https://nsarchive.gwu.edu/briefing-book/environmental-diplomacy/2018-09-24/us-climate-change-washingtons-see-saw-global-leadership</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-13" href="#footnote-anchor-13" class="footnote-number" contenteditable="false" target="_self">13</a><div class="footnote-content"><p>Robert A. Wampler. <em>U.S. Climate Change Policy in the 1980s. </em>George Washington University: National Security Archive. (2015) <a href="https://nsarchive2.gwu.edu/NSAEBB/NSAEBB536-Reagan-Bush-Recognized-Need-for-US-Leadership-on-Climate-Change-in-1980s/">https://nsarchive2.gwu.edu/NSAEBB/NSAEBB536-Reagan-Bush-Recognized-Need-for-US-Leadership-on-Climate-Change-in-1980s/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-14" href="#footnote-anchor-14" class="footnote-number" contenteditable="false" target="_self">14</a><div class="footnote-content"><p><em>Signing the Kyoto Protocol. </em>Clinton Presidential Library. (1997) <a href="https://clinton.presidentiallibraries.us/exhibits/show/green-building/kyoto-protocol">https://clinton.presidentiallibraries.us/exhibits/show/green-building/kyoto-protocol</a>.</p><p><em>Text of a Letter from the President to Senators Hagel, Helms, Craig, and Roberts. </em>George W. Bush White House Archives. (2001) <a href="https://georgewbush-whitehouse.archives.gov/news/releases/2001/03/20010314.html">https://georgewbush-whitehouse.archives.gov/news/releases/2001/03/20010314.html</a>.</p><p>Tanya Somanader. <em>President Obama: The United States Formally Enters the Paris Agreement. </em>Obama White House Archives. 
(2016) <a href="https://obamawhitehouse.archives.gov/blog/2016/09/03/president-Obama-United-states-formally-enters-Paris-agreement">https://obamawhitehouse.archives.gov/blog/2016/09/03/president-Obama-United-states-formally-enters-Paris-agreement</a>.</p><p>Michael R. Pompeo. <em>On the U.S. Withdrawal from the Paris Agreement. </em>United States Department of State Archives. (2019) <a href="https://2017-2021.state.gov/on-the-u-s-withdrawal-from-the-paris-agreement/">https://2017-2021.state.gov/on-the-u-s-withdrawal-from-the-paris-agreement/</a>.</p><p>Antony J. Blinken. <em>The United States Officially Rejoins the Paris Agreement. </em>United States Department of State. (2021) <a href="https://www.state.gov/the-united-states-officially-rejoins-the-paris-agreement/">https://www.state.gov/the-united-states-officially-rejoins-the-paris-agreement/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-15" href="#footnote-anchor-15" class="footnote-number" contenteditable="false" target="_self">15</a><div class="footnote-content"><p><em>EPA's Budget and Spending</em>. United States Environmental Protection Agency. (Accessed April 29, 2024) <a href="https://www.epa.gov/planandbudget/budget">https://www.epa.gov/planandbudget/budget</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-16" href="#footnote-anchor-16" class="footnote-number" contenteditable="false" target="_self">16</a><div class="footnote-content"><p>Matthew L. Wald. <em>E.P.A. Says It Will Change Rules Governing Industrial Pollution. </em>New York Times. (2002) <a href="https://www.nytimes.com/2002/11/23/us/epa-says-it-will-change-rules-governing-industrial-pollution.html">https://www.nytimes.com/2002/11/23/us/epa-says-it-will-change-rules-governing-industrial-pollution.html</a>.</p><p>Gavin Bade. <em>EPA loosens Clean Air Act rules for major pollution sources. </em>Utility Dive. 
(2018) <a href="https://www.utilitydive.com/news/epa-loosens-clean-air-act-rules-for-major-pollution-sources/515661/">https://www.utilitydive.com/news/epa-loosens-clean-air-act-rules-for-major-pollution-sources/515661/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-17" href="#footnote-anchor-17" class="footnote-number" contenteditable="false" target="_self">17</a><div class="footnote-content"><p>Laignee Barron. <em>The EPA&#8217;s Website After a Year of Climate Change Censorship. </em>Time. (2018) <a href="https://time.com/5075265/epa-website-climate-change-censorship/">https://time.com/5075265/epa-website-climate-change-censorship/</a>.</p></div></div>]]></content:encoded></item><item><title><![CDATA[A Narrative History of Environmentalism's Partisanship]]></title><description><![CDATA[This post describes the history of how particular partisan alliances were made involving the environmental movement between 1980 and 2008. Since individual decisions are central to understanding why this happened, this history is best presented as a narrative following the key people and organizations.]]></description><link>https://blog.aiimpacts.org/p/a-narrative-history-of-environmentalisms</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/a-narrative-history-of-environmentalisms</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Tue, 14 May 2024 16:49:43 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/4fa2c20d-72ca-467c-b6d6-d2c6663cfbc7_1024x1024.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>This is the second in a sequence of four posts taken from my recent report: <a href="http://aiimpacts.org/wp-content/uploads/2023/04/Why-Did-Environmentalism-Become-Partisan-1.pdf">Why Did Environmentalism Become Partisan?</a></em></p><p><em>Many of the specific claims made here are investigated in the full report. 
If you want to know more about how fossil fuel companies&#8217; campaign contributions, the partisan lean of academia, or newspapers&#8217; reporting on climate change have changed since 1980, the information is there.</em></p><h2>Introduction</h2><p>Environmentalism in the United States today is unusually partisan, compared to other issues, countries, or even the United States in the 1980s. This contingency suggests that the explanation centers on the choices of individual decision makers, not on broad structural or ideological factors that would be consistent across many countries and times.</p><p>This post describes the history of how particular partisan alliances were made involving the environmental movement between 1980 and 2008. Since individual decisions are central to understanding why this happened, this history is best presented as a narrative following the key people and organizations.</p><h2>Environmentalism in the Reagan Era</h2><p>In the wake of the New Deal, the Republican Party acquiesced to the government having a larger role in society than it had had before the Great Depression.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> Republican presidents would sometimes support increased government spending and regulation. This is apparent in environmental policy: Nixon was involved in several major pieces of environmental legislation and created the EPA in the executive branch.</p><p>The election of Reagan in 1980 reoriented the Republican Party. It would now advocate for a smaller government: less (non-military) spending, lower taxes, and less regulation. The free market would supply many of the services that the government had previously provided. 
Thatcher&#8217;s election as Prime Minister had similar results for the Conservative Party in the UK.</p><p>This might seem like it would cause a deep ideological conflict: Environmentalists advocated for regulations on private enterprise and international cooperation on policy, while the Republican Party preferred private and local action. However, this is not what we observe with either Reagan or Thatcher.</p><p>The conservative leaders in the US and UK supported environmentalism, even when it involved international regulations. The clearest example of this is the Montreal Protocol on Substances That Deplete the Ozone Layer, signed in 1987. Reagan described it as follows:</p><blockquote><p>The Montreal protocol is a model of cooperation. It is a product of the recognition and international consensus that ozone depletion is a global problem, both in terms of its causes and its effects. The protocol is the result of an extraordinary process of scientific study, negotiations among representatives of the business and environmental communities, and international diplomacy. It is a monumental achievement.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a></p></blockquote><p>The US Senate ratified the Montreal Protocol unanimously.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a></p><p>Reagan and Thatcher also specifically supported international regulation to combat climate change. 
Thatcher was the first head of government to talk about climate change at the UN, in 1989, and called for an international conference on climate change in 1992 (the Earth Summit in Rio de Janeiro).<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a> The Intergovernmental Panel on Climate Change (IPCC) produces the reports summarizing the scientific consensus about climate change, and its &#8220;principal architect was the conservative Reagan administration.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> In 1992, the U.N. Framework Convention on Climate Change (UNFCCC), the result of the Rio Summit, had the support of the Bush Sr. administration. The U.S. Senate decided that it was popular enough to not need a roll-call vote.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a></p><p>These actions of conservative leaders also translated into broad popular support, including among Republicans. The 1980s saw increasing concern about the environment and a decreasing partisan gap. Republicans only became anti-environmentalist after 1990.</p><p>Anti-environmentalism is not the natural consequence of the small government ideology of Reagan and Thatcher. 
It only entered the US Republican Party a decade later, and the UK Conservative Party has continued to support environmentalism.</p><h2>Environmentalists, Climate Scientists, &amp; Democratic Politicians</h2><p>The earliest partisan alliances involving climate change began in Congressional hearings in the 1980s.</p><p>The Reagan administration entered office promising to reverse most of the energy policies of the Carter administration and dramatically shrink the Department of Energy, which had just been created in 1977.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> One of the programs cut was a newly established center for climate research.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a> As an undergraduate student, Al Gore had taken classes in climate science from Roger Revelle, one of the first people to study global warming. In the House of Representatives, Gore led Congressional hearings against these particular cuts (which were partially reversed), and continued to be very involved whenever climate was an issue in Washington.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a> Climate policy at this time was still bipartisan, and the Reagan administration was open to government action on climate.</p><p>The environmental movement also became increasingly interested in climate change in the late 1980s, particularly after the summer of 1988. This summer saw severe drought across most of the U.S., low enough water in the Mississippi River to hinder barge traffic, heat waves, major fires in Yellowstone, and a Category 5 hurricane in the fall. 
In and after a Congressional hearing on the climate, James Hansen of NASA claimed that he was &#8220;99 percent certain&#8221; that &#8220;the greenhouse effect is here.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a> Many other climate scientists did not believe that the evidence was that strong yet and disliked his combative tone. A few publicly rebuked him.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a></p><p>Focusing on climate change provided a way to unify the disparate concerns of the environmental movement, including air &amp; water pollution, habitat conservation, recycling, and energy production. The historian of science Spencer Weart describes this transition as follows:</p><blockquote><p>The environmental movement, which had found only occasional interest in global warming, now took it up as a main cause. Groups that had other reasons for preserving tropical forests, promoting energy conservation, slowing population growth, or reducing air pollution could make common cause as they offered their various ways to reduce emissions of CO<sub>2</sub>.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-12" href="#footnote-12" target="_self">12</a><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-13" href="#footnote-13" target="_self">13</a></p></blockquote><p>An unusually explicit statement of this strategy comes from Senator Timothy Wirth (D-CO):</p><blockquote><p>What we've got to do in energy conservation is try to ride the global warming issue. 
Even if the theory of global warming is wrong, to have approached global warming as if it is real means energy conservation, so we will be doing the right thing anyway in terms of economic policy and environmental policy.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-14" href="#footnote-14" target="_self">14</a></p></blockquote><p>In the late 1980s and early 1990s, environmentalists became increasingly focused on climate science, and both environmentalists and climate scientists formed political alliances with Democratic politicians.</p><h2>The Clinton-Gore Administration and the BTU Tax</h2><p>In 1992, Bill Clinton selected Al Gore as his vice presidential candidate and secured the endorsements of environmental organizations like the Sierra Club that had mostly stayed above the partisan fray.</p><p>One of the Clinton administration&#8217;s early legislative goals was a tax on energy, measured in British thermal units (BTUs).<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-15" href="#footnote-15" target="_self">15</a> While this is sometimes remembered as an attempt at a carbon tax, it taxed energy rather than carbon dioxide. Solar, wind, and geothermal power production were exempted, but nuclear and hydroelectricity were not. This tax proved extremely unpopular. To shore up support, the Clinton administration agreed to more exemptions for particular industries, but this diminished what the bill hoped to accomplish, did not improve its popularity in Congress, and encouraged even more groups to request exemptions.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-16" href="#footnote-16" target="_self">16</a> The broad-based BTU tax was abandoned and replaced with a much weaker tax on gasoline. 
Congressmen who had supported the BTU tax suffered politically in the midterms.</p><p>In 1994, Republicans won control of the House of Representatives for the first time in 40 years. The BTU tax was not the only significant issue: the NRA also organized against an assault weapons ban, and Newt Gingrich pioneered a national strategy instead of focusing on individual races. Opposing a climate policy was one thing that helped propel Republicans into power in Congress.</p><h2>Fossil Fuel Companies, Climate Skeptics, &amp; Conservative Think Tanks</h2><p>The fossil fuel industry opposed government action on climate change. Significant reductions in greenhouse gas emissions would completely undermine their business model, forcing them to transition to a different industry (renewable energy) or dramatically lose market share. In 1989, a group of fossil fuel and manufacturing companies founded the Global Climate Coalition to oppose climate policy that they claimed would disrupt the American economy.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-17" href="#footnote-17" target="_self">17</a> The Global Climate Coalition spent tens of millions of dollars in ad campaigns and contributions to politicians before it disbanded in 2001.</p><p>At the time, it was not obvious that the industry lobbying would overwhelmingly favor Republicans. There had not previously been a strong tendency for fossil fuel companies to support Republicans.</p><p>It is not too surprising that the industry lobby ended up favoring Republicans. Republicans were the more business-friendly party. There was already a small bias in campaign contributions in that direction, and oil is more concentrated in Republican-leaning states. 
The Gulf War in 1990 might have associated the oil industry with the Republican Party, although the war proved broadly popular.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-18" href="#footnote-18" target="_self">18</a> Academics, probably including climate scientists, were somewhat more likely to lean to the left, although not nearly as much as they do today. Congressional Republicans had been somewhat more likely to oppose environmental legislation than Congressional Democrats for several decades. But then there was an abrupt change in the early 1990s, and this difference dramatically increased.</p><p>The Global Climate Coalition found willing allies among conservative think tanks. These think tanks would accept funding from the fossil fuel industry to hire skeptical climate scientists or experts from other fields who were skeptical of climate change. They would publish policy studies, newsletter articles, and press releases that cast doubt on conventional climate science and opposed climate policy proposals. These think tanks and skeptics were successful at reframing the climate debate and creating the &#8220;non-problematicity&#8221; of global warming among conservatives.<em> </em>The most common claims by conservative think tanks were that &#8220;the scientific evidence for global warming is highly uncertain&#8221; and &#8220;proposed action would harm the national economy.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-19" href="#footnote-19" target="_self">19</a></p><p>It is not clear to me whether the causal relationship mostly points from conservative think tanks to Republican congressmen or vice versa. The first climate skeptic publication by a major conservative think tank was in 1991, before Clinton &amp; Gore were elected or Gingrich became Speaker of the House. 
However, there were initially only a handful of publications per year, mostly by a single think tank: the Marshall Institute. The publications did not become common or widespread until 1996-1997. Between 1991 and 1996, most conservative think tanks did not yet have a public position on climate change.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!hVRV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!hVRV!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png 424w, https://substackcdn.com/image/fetch/$s_!hVRV!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png 848w, https://substackcdn.com/image/fetch/$s_!hVRV!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png 1272w, https://substackcdn.com/image/fetch/$s_!hVRV!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!hVRV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png" width="1150" height="387" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:387,&quot;width&quot;:1150,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:91819,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!hVRV!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png 424w, https://substackcdn.com/image/fetch/$s_!hVRV!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png 848w, https://substackcdn.com/image/fetch/$s_!hVRV!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png 1272w, https://substackcdn.com/image/fetch/$s_!hVRV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4144a133-455c-41dd-bdc0-eaa2d5e715e3_1150x387.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em><strong>Figure 1: </strong>Type and Year of Publication of Documents on Global Warming Circulated by Major Conservative Think Tanks. Reprinted from McCright &amp; Dunlap (2000).</em></figcaption></figure></div><p>After the Republican Party led by Gingrich won the Midterm elections in 1994, the number of Congressional hearings about climate change decreased. 
When there were hearings, Congress would invite similar numbers of conventional climate scientists and climate change skeptics.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-20" href="#footnote-20" target="_self">20</a> Congress began to treat this as an active scientific debate and the media followed suit,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-21" href="#footnote-21" target="_self">21</a> even though both had previously predominantly presented the scientific consensus.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-MRp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-MRp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png 424w, https://substackcdn.com/image/fetch/$s_!-MRp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png 848w, https://substackcdn.com/image/fetch/$s_!-MRp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png 1272w, https://substackcdn.com/image/fetch/$s_!-MRp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!-MRp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png" width="522" height="435.6641221374046" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:656,&quot;width&quot;:786,&quot;resizeWidth&quot;:522,&quot;bytes&quot;:76497,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!-MRp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png 424w, https://substackcdn.com/image/fetch/$s_!-MRp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png 848w, https://substackcdn.com/image/fetch/$s_!-MRp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png 1272w, https://substackcdn.com/image/fetch/$s_!-MRp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3eb8d952-95f0-4caf-a03e-8cf14dc753c3_786x656.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em><strong>Figure 2: </strong>Natural Scientists' Testimonies Delivered Each Year by Climate Change Skeptics and Conventional Scientists in Congressional Hearings About Global Warming. The vertical axis is the percentage of testimonies and the number of testimonies is printed above each column. 
Reprinted from McCright &amp; Dunlap (2003).</em></figcaption></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ORKc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ORKc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png 424w, https://substackcdn.com/image/fetch/$s_!ORKc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png 848w, https://substackcdn.com/image/fetch/$s_!ORKc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png 1272w, https://substackcdn.com/image/fetch/$s_!ORKc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ORKc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png" width="510" height="415.61514195583595" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/82f97337-5d7e-424d-9d72-733764f5876e_951x775.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:775,&quot;width&quot;:951,&quot;resizeWidth&quot;:510,&quot;bytes&quot;:124813,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ORKc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png 424w, https://substackcdn.com/image/fetch/$s_!ORKc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png 848w, https://substackcdn.com/image/fetch/$s_!ORKc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png 1272w, https://substackcdn.com/image/fetch/$s_!ORKc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82f97337-5d7e-424d-9d72-733764f5876e_951x775.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em><strong>Figure 3: </strong>Number of Global Warming-Related Articles Citing one of five &#8220;Elite&#8221; Climate Scientists and/or one of five Climate Change Skeptics as an Information Source. Reprinted from McCright &amp; Dunlap (2003).</em></figcaption></figure></div><h2>Debates Over the Kyoto Protocol</h2><p>The Earth Summit in Rio de Janeiro in 1992 which had produced the UN Framework Convention on Climate Change (UNFCCC) also called for a future summit, in Kyoto, that would impose limits on countries&#8217; greenhouse gas emissions.</p><p>While the UNFCCC had gotten broad bipartisan support under the Bush Sr. administration, the politics of climate change had changed dramatically since then. 
The debate over the Kyoto Protocol would see the last major bipartisan actions on climate change and the beginning of substantial partisanship among the public.</p><p>Before the summit, the US Senate unanimously passed the Byrd-Hagel Resolution, which declared that it would not support any treaty that imposed restrictions on developed countries (like the US) but not developing countries.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-22" href="#footnote-22" target="_self">22</a></p><p>The summit was very contentious and the negotiations almost collapsed. On the last day, Vice President Gore flew to Kyoto to save the agreement. The resulting Kyoto Protocol did not impose any restrictions on the greenhouse gas emissions of developing countries. President Clinton signed the treaty, but did not even submit it to the Senate for consideration.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-23" href="#footnote-23" target="_self">23</a> To be ratified, it would need the votes of two-thirds of the Senate, which had just unanimously opposed its terms.</p><p>I am uncertain whether this should be thought of more as a case where the Senate overconstrained the international negotiating position of the presidential administration or more as a case of the administration ignoring the advice of the Senate.</p><p>Before and during the summit, the Clinton administration ran a media campaign to build public support for the resulting treaty. There was a massive increase in media coverage, most of which was aligned with conventional climate science. Conservative think tanks also dramatically increased their production of skeptical media. A pair of surveys conducted before and after this debate found that it did make people more aware of climate change as an issue. A majority of people believed that climate change was going to happen, was going to be bad, and that the government should limit air pollution to address it. 
The overall percentages of people who supported these positions did not change as a result of the debate. However, there were underlying shifts: strong Democrats came to increasingly support the administration&#8217;s policy, while strong Republicans came to oppose it.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-24" href="#footnote-24" target="_self">24</a></p><p>The Kyoto Protocol was a failed attempt at climate policy in the United States that directly increased partisanship.</p><h2>Continued Increases in Partisanship</h2><p>Partisanship continued to increase after the debate over the Kyoto Protocol.</p><p>Al Gore ran for president in 2000. Although he was strongly associated with climate change by this point, multiple sources claimed that environmentalism was not a major issue in this election.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-25" href="#footnote-25" target="_self">25</a> Bush Jr. became president instead, declared that the U.S. would not fulfill its obligations under the Kyoto Protocol, and reduced funding for climate science.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-26" href="#footnote-26" target="_self">26</a></p><p>During this time, some of the structural factors that might have been contributing to rising partisanship ended. The Global Climate Coalition disbanded in 2001. Some of the companies that had been involved accepted climate change, while others continued to promote skepticism. The mainstream media stopped presenting both sides of the debate in 2003-2004. Nevertheless, the partisan gap continued to grow.</p><p>In 2006, Gore released a climate change documentary titled <em>An Inconvenient Truth</em>. 
This did not change overall public opinion.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-27" href="#footnote-27" target="_self">27</a> Instead, partisanship continued to increase: more Democrats were becoming climate activists, while Republicans were becoming increasingly skeptical.</p><p>There were still prominent Republicans who supported policies to counteract climate change. Governor Arnold Schwarzenegger introduced a cap-and-trade system for California,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-28" href="#footnote-28" target="_self">28</a> while Senator John McCain co-sponsored a bill that would create a similar system for the country.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-29" href="#footnote-29" target="_self">29</a> However, more and more Republicans turned against environmentalism, and the environmental movement became increasingly tied to the Democratic Party.</p><p>Subsequent decisions by both parties, and the environmental movement itself, continued to contribute to rising partisanship on environmental issues in the United States.</p><h2>Conclusion</h2><p>Broad structural and ideological differences do not explain the partisanship of environmentalism. During the Reagan and Bush Sr. administrations, the Republican Party did support environmentalism, including international agreements on climate change, despite its small-government orientation on most issues. The Republican Party did not significantly change its ideology between the 1980s and 2000s. The subsequent partisanship in environmentalism therefore cannot be explained by foundational ideological differences between Democrats and Republicans. 
Instead, the explanation involves a history of alliances made by particular decision makers.</p><p>The first alliance was between environmentalists, climate scientists, and Congressional Democrats during Congressional hearings in the 1980s. The key figure here was Al Gore. Environmentalists seemed to accept the usefulness of this alliance and did not seriously try to find a similarly prominent or rising Republican politician to ally with as well.</p><p>The second alliance was between fossil fuel companies, climate skeptics, and conservative think tanks, starting around 1990. The industry organized into the Global Climate Coalition in 1989 and convinced the Marshall Institute to begin publishing climate skepticism in 1991. For a few more years, this alliance was not complete: most conservative think tanks were still neutral on climate change, and environmentalists might have been able to convince some of them to support their cause.</p><p>When environmentalist-aligned Democrats were in political power in the 1990s, they made several policy proposals that were deeply flawed. A tax proposed in 1993 taxed energy produced, not carbon dioxide emitted, and included arbitrary exemptions. The Kyoto Protocol in 1997 contained terms that the Senate had previously rejected unanimously. These flawed policy proposals made it easier for Republicans like Gingrich or Bush Jr. to rally the public against them &#8211; and against environmentalism more broadly. Subsequent decisions, on both sides of the aisle, continued to reinforce the trend towards increasing partisanship.</p><p>This partisanship could have been avoided if various decision makers had made different choices about which alliances to form or not form. Environmentalism is not partisan in many other countries, including in highly partisan countries like South Korea or France. The resulting partisanship was bad for the environmental movement.
As partisanship increased in the 1990s and early 2000s, environmentalism saw flat or falling support, fewer major legislative accomplishments, and fluctuating executive actions.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Prior to the Great Depression, there was less disagreement between the two parties about what the size of the government should be. The New Deal saw Democrats dramatically increase the size and role of the government, which Republicans initially opposed.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p><em>Montreal Protocol on Substances that Deplete the Ozone Layer. </em>U.S. Department of State.
(Accessed April 17, 2024) <a href="https://www.state.gov/key-topics-office-of-environmental-quality-and-transboundary-issues/the-montreal-protocol-on-substances-that-deplete-the-ozone-layer/">https://www.state.gov/key-topics-office-of-environmental-quality-and-transboundary-issues/the-montreal-protocol-on-substances-that-deplete-the-ozone-layer/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p><em>Montreal Protocol on Substances that Deplete the Ozone Layer. </em>Senate Consideration of Treaty Document 100-10. (1988) <a href="https://www.congress.gov/treaty-document/100th-congress/10">https://www.congress.gov/treaty-document/100th-congress/10</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>Margaret Thatcher. <em>Speech to United Nations General Assembly (Global Environment). </em>(1989) <a href="https://www.margaretthatcher.org/document/107817">https://www.margaretthatcher.org/document/107817</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>Spencer Weart. <em>The Discovery of Global Warming. 
</em>Government: The View from Washington.<em> </em>(Accessed Feb 2024) <a href="https://history.aip.org/climate/Govt.htm">https://history.aip.org/climate/Govt.htm</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>The UNFCCC was ratified using a division vote, in which Senators stand for &#8220;yea&#8221; and &#8220;nay&#8221; and the presiding officer counts the number of Senators standing for each. The result of the vote is not recorded other than whether it passed. Treaties require 2/3 support of the Senate to be ratified, so it had to have had significant bipartisan support. Typically, division votes and voice votes are used when the result of the vote is not in doubt beforehand.</p><p><em>About Voting. </em>U.S. Senate. (Accessed March 22, 2024) <a href="https://www.senate.gov/about/powers-procedures/voting.htm">https://www.senate.gov/about/powers-procedures/voting.htm</a>.</p><p><em>United Nations Framework Convention on Climate Change. </em>Senate Consideration of Treaty Document 102-38. (1992) <a href="https://www.congress.gov/treaty-document/102nd-congress/38">https://www.congress.gov/treaty-document/102nd-congress/38</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p><em>Republican Party Platform of 1980.</em> &#167; Energy. 
<a href="https://www.presidency.ucsb.edu/documents/republican-party-platform-1980">https://www.presidency.ucsb.edu/documents/republican-party-platform-1980</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p><em>Climate Change in the 1970s. </em>American Institute of Physics. (Accessed March 29, 2024) <a href="https://history.aip.org/history/exhibits/climate-change-in-the-70s/">https://history.aip.org/history/exhibits/climate-change-in-the-70s/</a>.</p><p>Spencer Weart. <em>The Discovery of Global Warming. </em>Government: The View from Washington.<em> </em>(Accessed Feb 2024) <a href="https://history.aip.org/climate/Govt.htm">https://history.aip.org/climate/Govt.htm</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>Roger A. Pielke Jr. <em>Policy history of the US Global Change Research Program: Part I. Administrative development</em>. Global Environmental Change <strong>10</strong>. (2000) p. 9-25. <a href="https://sciencepolicy.colorado.edu/admin/publication_files/2000.09.pdf">https://sciencepolicy.colorado.edu/admin/publication_files/2000.09.pdf</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>Philip Shabecoff. <em>Global Warming Has Begun, Expert Tells Senate. </em>New York Times. 
(1988) <a href="https://www.nytimes.com/1988/06/24/us/global-warming-has-begun-expert-tells-senate.html">https://www.nytimes.com/1988/06/24/us/global-warming-has-begun-expert-tells-senate.html</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p>Richard A. Kerr. <em>Hansen vs. the World on the Greenhouse Threat. </em>Science <strong>244</strong>. (1989) <a href="https://www.science.org/doi/abs/10.1126/science.244.4908.1041">https://www.science.org/doi/abs/10.1126/science.244.4908.1041</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-12" href="#footnote-anchor-12" class="footnote-number" contenteditable="false" target="_self">12</a><div class="footnote-content"><p>Spencer Weart. <em>The Discovery of Global Warming. </em>The Public and Climate Change Since 1980.<em> </em>(Accessed Feb 2024) <a href="https://history.aip.org/climate/public2.htm">https://history.aip.org/climate/public2.htm</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-13" href="#footnote-anchor-13" class="footnote-number" contenteditable="false" target="_self">13</a><div class="footnote-content"><p>Scott Alexander has also noticed this transition and described it as:</p><blockquote><p>It feels almost like some primitive barter system has been converted to a modern economy, with tons of CO<sub>2</sub> emission as the universal interchangeable currency that can be used to put a number value on all environmental issues.&nbsp;</p></blockquote><p>Scott Alexander. <em>What Happened To 90s Environmentalism? </em>Slate Star Codex. 
(2019)&nbsp; <a href="https://slatestarcodex.com/2019/01/01/what-happened-to-90s-environmentalism/">https://slatestarcodex.com/2019/01/01/what-happened-to-90s-environmentalism/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-14" href="#footnote-anchor-14" class="footnote-number" contenteditable="false" target="_self">14</a><div class="footnote-content"><p>Roger A. Pielke Jr., Roberta Klein, &amp; Daniel Sarewitz. <em>Turning the Big Knob: An Evaluation of the Use of Energy Policy to Modulate Future Climate Impacts. </em>Energy and Environment, <strong>11</strong>. (2000) p. 255-276. <a href="https://sciencepolicy.colorado.edu/about_us/meet_us/roger_pielke/knob/text.html">https://sciencepolicy.colorado.edu/about_us/meet_us/roger_pielke/knob/text.html</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-15" href="#footnote-anchor-15" class="footnote-number" contenteditable="false" target="_self">15</a><div class="footnote-content"><p><em>Some history. </em>Carbon Tax Center. (Accessed March 29, 2024) <a href="https://www.carbontax.org/some-history/">https://www.carbontax.org/some-history/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-16" href="#footnote-anchor-16" class="footnote-number" contenteditable="false" target="_self">16</a><div class="footnote-content"><p>Dawn Erlandson. <em>The BTU Tax Experience: What Happened and Why It Happened. </em>Pace Environmental Law Review <strong>12.1</strong>. 
(1994) <a href="https://digitalcommons.pace.edu/cgi/viewcontent.cgi?article=1528&amp;context=pelr">https://digitalcommons.pace.edu/cgi/viewcontent.cgi?article=1528&amp;context=pelr</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-17" href="#footnote-anchor-17" class="footnote-number" contenteditable="false" target="_self">17</a><div class="footnote-content"><p><em>Global Climate Coalition.</em> Source Watch. (Accessed March 29, 2024) <a href="https://www.sourcewatch.org/index.php/Global_Climate_Coalition">https://www.sourcewatch.org/index.php/Global_Climate_Coalition</a>.</p><p><em>GCC's position on the climate issue.</em> Global Climate Coalition. (Archive: Feb 9, 1999) <a href="http://web.archive.org/web/19990209102342/http://www.globalclimate.org/MISSION.htm">http://web.archive.org/web/19990209102342/http://www.globalclimate.org/MISSION.htm</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-18" href="#footnote-anchor-18" class="footnote-number" contenteditable="false" target="_self">18</a><div class="footnote-content"><p>David W. Moore. <em>Americans Believe U.S. Participation in Gulf War a Decade Ago Worthwhile. </em>Gallup (2001) <a href="https://news.gallup.com/poll/1963/americans-believe-us-participation-gulf-war-decade-ago-worthwhile.aspx">https://news.gallup.com/poll/1963/americans-believe-us-participation-gulf-war-decade-ago-worthwhile.aspx</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-19" href="#footnote-anchor-19" class="footnote-number" contenteditable="false" target="_self">19</a><div class="footnote-content"><p>Aaron M. McCright &amp; Riley E. Dunlap. <em>Challenging Global Warming as a Social Problem: An Analysis of the Conservative Movement's Counter-Claims. </em>Social Problems <strong>47.4</strong>. (2000) p. 499-522. 
<a href="https://www.researchgate.net/publication/237371278_Challenging_Global_Warming_as_a_Social_Problem_An_Analysis_of_the_Conservative_Movement%27s_Counter-Claims">https://www.researchgate.net/publication/237371278_Challenging_Global_Warming_as_a_Social_Problem_An_Analysis_of_the_Conservative_Movement%27s_Counter-Claims</a>.</p><p>Peter J. Jacques, Riley E. Dunlap, &amp; Mark Freeman. <em>The organisation of denial: Conservative think tanks and environmental scepticism. </em>Environmental Politics. (2008) p. 349-385. <a href="https://www.tandfonline.com/doi/full/10.1080/09644010802055576">https://www.tandfonline.com/doi/full/10.1080/09644010802055576</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-20" href="#footnote-anchor-20" class="footnote-number" contenteditable="false" target="_self">20</a><div class="footnote-content"><p>Aaron M. McCright &amp; Riley E. Dunlap. <em>Defeating Kyoto: The Conservative Movement's Impact on U.S. Climate Change Policy. </em>Social Problems <strong>50.3</strong>. (2003), p. 348-373. <a href="https://www.researchgate.net/publication/228594257_Defeating_Kyoto_The_Conservative_Movement%27s_Impact_on_US_Climate_Change_Policy">https://www.researchgate.net/publication/228594257_Defeating_Kyoto_The_Conservative_Movement%27s_Impact_on_US_Climate_Change_Policy</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-21" href="#footnote-anchor-21" class="footnote-number" contenteditable="false" target="_self">21</a><div class="footnote-content"><p>The newspapers included are: <em>Wall Street Journal</em>, <em>USA Today</em>, <em>New York Times</em>, <em>Los Angeles Times</em>, <em>Washington Post</em>, <em>Chicago Tribune</em>, and <em>Newsday</em>.</p><p>Aaron M. McCright &amp; Riley E. Dunlap. <em>Defeating Kyoto: The Conservative Movement's Impact on U.S. Climate Change Policy. </em>Social Problems <strong>50.3</strong>. (2003) p. 348-373. 
<a href="https://www.researchgate.net/publication/228594257_Defeating_Kyoto_The_Conservative_Movement's_Impact_on_US_Climate_Change_Policy">https://www.researchgate.net/publication/228594257_Defeating_Kyoto_The_Conservative_Movement's_Impact_on_US_Climate_Change_Policy</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-22" href="#footnote-anchor-22" class="footnote-number" contenteditable="false" target="_self">22</a><div class="footnote-content"><p><em>A resolution expressing the sense of the Senate regarding the conditions for the United States becoming a signatory to any international agreement on greenhouse gas emissions under the United Nations Framework Convention on Climate Change. </em>Senate Resolution 98. (1997) <a href="https://www.congress.gov/bill/105th-congress/senate-resolution/98">https://www.congress.gov/bill/105th-congress/senate-resolution/98</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-23" href="#footnote-anchor-23" class="footnote-number" contenteditable="false" target="_self">23</a><div class="footnote-content"><p><em>United States Signs the Kyoto Protocol. </em>Bureau of Oceans and International Environmental and Scientific Affairs. (1998) <a href="https://1997-2001.state.gov/global/global_issues/climate/fs-us_sign_kyoto_981112.html">https://1997-2001.state.gov/global/global_issues/climate/fs-us_sign_kyoto_981112.html</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-24" href="#footnote-anchor-24" class="footnote-number" contenteditable="false" target="_self">24</a><div class="footnote-content"><p>Jon A. Krosnick, Allyson L. Holbrook, &amp; Penny S. Visser. <em>The impact of the fall 1997 debate about global warming on American public opinion. </em>Public Understanding of Science <strong>9</strong>. (2000) p. 239-260. 
<a href="https://citeseerx.ist.psu.edu/document?repid=rep1&amp;type=pdf&amp;doi=1ecba8f2535dd16fe855168cfeb35592e36259be">https://citeseerx.ist.psu.edu/document?repid=rep1&amp;type=pdf&amp;doi=1ecba8f2535dd16fe855168cfeb35592e36259be</a>.</p><p>Steven Kull. <em>Americans on Global Warming: A Study of U.S. Public Attitudes. </em>Program on International Policy Attitudes. (1998) <a href="https://publicconsultation.org/wp-content/uploads/2020/09/GlobalWarming_1998.pdf">https://publicconsultation.org/wp-content/uploads/2020/09/GlobalWarming_1998.pdf</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-25" href="#footnote-anchor-25" class="footnote-number" contenteditable="false" target="_self">25</a><div class="footnote-content"><p>Spencer Weart. <em>The Discovery of Global Warming. </em>Government: The View from Washington.<em> </em>(Accessed Feb 2024) <a href="https://history.aip.org/climate/Govt.htm">https://history.aip.org/climate/Govt.htm</a>.</p><p>Gerald M. Pomper. <em>The 2000 Presidential Election: Why Gore Lost. </em>Political Science Quarterly <strong>116.2</strong>. (2001) p. 201. <a href="https://www.uvm.edu/~dguber/POLS125/articles/pomper.htm">https://www.uvm.edu/~dguber/POLS125/articles/pomper.htm</a>.</p><p>Thomas E. Mann. <em>Reflections on the 2000 U.S. Presidential Election. </em>Brookings. (2001) <a href="https://www.brookings.edu/articles/reflections-on-the-2000-u-s-presidential-election/">https://www.brookings.edu/articles/reflections-on-the-2000-u-s-presidential-election/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-26" href="#footnote-anchor-26" class="footnote-number" contenteditable="false" target="_self">26</a><div class="footnote-content"><p><em>Text of a Letter from the President to Senators Hagel, Helms, Craig, and Roberts. </em>George W. Bush White House Archives. 
(2001) <a href="https://georgewbush-whitehouse.archives.gov/news/releases/2001/03/20010314.html">https://georgewbush-whitehouse.archives.gov/news/releases/2001/03/20010314.html</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-27" href="#footnote-anchor-27" class="footnote-number" contenteditable="false" target="_self">27</a><div class="footnote-content"><p>Deborah Lynn Guber. <em>A Cooling Climate for Change? Party Polarization and the Politics of Global Warming. </em>American Behavioral Scientist <strong>57.1</strong>. (2013) p. 93&#8211;115. <a href="https://cssn.org/wp-content/uploads/2020/12/A-Cooling-Climate-for-Change-Party-Polarization-and-the-Politics-of-Global-Warming-Deborah-Guber.pdf">https://cssn.org/wp-content/uploads/2020/12/A-Cooling-Climate-for-Change-Party-Polarization-and-the-Politics-of-Global-Warming-Deborah-Guber.pdf</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-28" href="#footnote-anchor-28" class="footnote-number" contenteditable="false" target="_self">28</a><div class="footnote-content"><p><em>California&#8217;s Cap-and-Trade Program: Frequently Asked Questions. </em>Legislative Analyst&#8217;s Office: The California Legislature&#8217;s Nonpartisan Fiscal and Policy Advisor (2023) <a href="https://lao.ca.gov/Publications/Report/4811">https://lao.ca.gov/Publications/Report/4811</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-29" href="#footnote-anchor-29" class="footnote-number" contenteditable="false" target="_self">29</a><div class="footnote-content"><p>&nbsp;<em>Climate Stewardship Act.</em> S.139. (2003) <a href="https://www.congress.gov/bill/108th-congress/senate-bill/139/all-info">https://www.congress.gov/bill/108th-congress/senate-bill/139/all-info</a>.</p><p>Marianne Lavelle. <em>John McCain&#8217;s Climate Change Legacy. </em>Inside Climate News. 
(2018) <a href="https://insideclimatenews.org/news/26082018/john-mccain-climate-change-leadership-senate-cap-trade-bipartisan-lieberman-republican-campaign/">https://insideclimatenews.org/news/26082018/john-mccain-climate-change-leadership-senate-cap-trade-bipartisan-lieberman-republican-campaign/</a>.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Environmentalism in the United States Is Unusually Partisan]]></title><description><![CDATA[Environmentalism in the United States is unusually partisan, compared to other issues, compared to other countries, and compared to the United States itself at other times.]]></description><link>https://blog.aiimpacts.org/p/environmentalism-in-the-united-states</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/environmentalism-in-the-united-states</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Mon, 13 May 2024 21:22:14 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/bdc79951-a6fa-4786-882d-45c520fca78a_838x682.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>This is the first in a sequence of four posts taken from my recent report: <a href="http://aiimpacts.org/wp-content/uploads/2023/04/Why-Did-Environmentalism-Become-Partisan-1.pdf">Why Did Environmentalism Become Partisan?</a></em></p><p></p><h2>Introduction</h2><p>In the United States, environmentalism is extremely partisan.</p><p>It might feel like this was inevitable. Caring about the environment, and supporting government action to protect the environment, might seem like they are inherently left-leaning. Partisanship has increased for many issues, so it might not be surprising that environmentalism became partisan too.</p><p>Looking at the public opinion polls more closely makes it more surprising. 
Environmentalism in the United States is unusually partisan, compared to other issues, compared to other countries, and compared to the United States itself at other times.&nbsp;</p><p>The partisanship of environmentalism was not inevitable.</p><h2>Compared to Other Issues</h2><p>Environmentalism is one of the, if not the, most partisan issues in the US.</p><p>The most recent data demonstrating this comes from a Gallup poll from 2023.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> Of the 24 issues surveyed, &#8220;Protecting the Environment Has Priority Over Energy Development&#8221; was tied for the largest partisan gap with &#8220;Government Should Ensure That Everyone Has Healthcare.&#8221; Of the top 5 most partisan issues, 3 were related to environmentalism. The amount this gap has widened since 2003 is also above average for these environmental issues.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!U4iV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!U4iV!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png 424w, https://substackcdn.com/image/fetch/$s_!U4iV!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png 848w, 
https://substackcdn.com/image/fetch/$s_!U4iV!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png 1272w, https://substackcdn.com/image/fetch/$s_!U4iV!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!U4iV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png" width="590" height="1294.4880952380952" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1843,&quot;width&quot;:840,&quot;resizeWidth&quot;:590,&quot;bytes&quot;:699977,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!U4iV!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png 424w, https://substackcdn.com/image/fetch/$s_!U4iV!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png 848w, 
https://substackcdn.com/image/fetch/$s_!U4iV!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png 1272w, https://substackcdn.com/image/fetch/$s_!U4iV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dd1f33-30f8-4025-a863-e67d351ca833_840x1843.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em><strong>Figure 1: </strong>The percentages of Republicans and Democrats who agree with each statement shown,
2003-2023. Reprinted from Gallup (2023).</em></figcaption></figure></div><p>Pew also has some recent relevant data.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> They ask whether 21 particular policies &#8220;should be a top priority for the president and Congress to address this year.&#8221; The largest partisan gap is for &#8220;protecting the environment&#8221; (47 p.p.), followed by &#8220;dealing with global climate change&#8221; (46 p.p.). These are ten percentage points higher than the next most partisan priority. These issues are less specific than the ones Gallup asked about, and so might not reveal as much of the underlying partisanship. For example, most Democrats and most Republicans agree that strengthening the economy is important, but they might disagree about how this should be done.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0ZaI!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!0ZaI!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png 424w, https://substackcdn.com/image/fetch/$s_!0ZaI!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png 848w, https://substackcdn.com/image/fetch/$s_!0ZaI!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png 
1272w, https://substackcdn.com/image/fetch/$s_!0ZaI!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!0ZaI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png" width="478" height="738.6238095238095" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1298,&quot;width&quot;:840,&quot;resizeWidth&quot;:478,&quot;bytes&quot;:204707,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!0ZaI!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png 424w, https://substackcdn.com/image/fetch/$s_!0ZaI!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png 848w, https://substackcdn.com/image/fetch/$s_!0ZaI!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png 1272w, 
https://substackcdn.com/image/fetch/$s_!0ZaI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1273f086-a9e5-4c63-89f3-d1951f48b683_840x1298.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em><strong>Figure 2: </strong>The percentages of Republicans and Democrats who believe that each issue should be a top priority.
Reprinted from Pew (2023).</em></figcaption></figure></div><p>Guber&#8217;s analysis of Gallup polls from 1990, 2000, &amp; 2010 also shows that environmentalism is unusually partisan.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> Concern about &#8220;the quality of the environment&#8221; has a partisan gap similar to that for &#8220;illegal immigration,&#8221; and larger than that for any other political issue. If we home in on concern about &#8220;global warming&#8221; within overall environmental concern, the partisan gap doubles, making it a clear outlier.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!KZ9K!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KZ9K!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png 424w, https://substackcdn.com/image/fetch/$s_!KZ9K!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png 848w, https://substackcdn.com/image/fetch/$s_!KZ9K!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png 1272w, https://substackcdn.com/image/fetch/$s_!KZ9K!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!KZ9K!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png" width="562" height="435.76953125" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:794,&quot;width&quot;:1024,&quot;resizeWidth&quot;:562,&quot;bytes&quot;:75639,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!KZ9K!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png 424w, https://substackcdn.com/image/fetch/$s_!KZ9K!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png 848w, https://substackcdn.com/image/fetch/$s_!KZ9K!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png 1272w, https://substackcdn.com/image/fetch/$s_!KZ9K!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcff08b17-2d19-4c03-99f3-721da815ae09_1024x794.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em><strong>Figure 3: </strong>Difference between the mean response on a four point scale for party identifiers on concern for various national problems in 2010. &#8220;I'm going to read you a list of problems facing the country. For each one, please tell me if you personally worry about this problem a great deal, a fair amount, only a little, or not at all.&#8221; Reprinted from Guber (2013).</em></figcaption></figure></div><p>The partisanship of environmentalism cannot be explained entirely by the processes that made other issues partisan. It is more partisan than those other issues.
At least this extra partisan gap wants an explanation.</p><h2>Compared to Other Countries</h2><p>The United States is more partisan than any other country on environmentalism, by a wide margin.</p><p>The best data comes from a Pew survey of &#8220;17 advanced economies&#8221; in 2021.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a> It found that 7 of them had no significant partisan gap, and that the US had a partisan gap that was almost twice as large as any other country.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Vvzp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Vvzp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png 424w, https://substackcdn.com/image/fetch/$s_!Vvzp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png 848w, https://substackcdn.com/image/fetch/$s_!Vvzp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png 1272w, https://substackcdn.com/image/fetch/$s_!Vvzp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Vvzp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png" width="418" height="772.6258064516129" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1146,&quot;width&quot;:620,&quot;resizeWidth&quot;:418,&quot;bytes&quot;:118473,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Vvzp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png 424w, https://substackcdn.com/image/fetch/$s_!Vvzp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png 848w, https://substackcdn.com/image/fetch/$s_!Vvzp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png 1272w, https://substackcdn.com/image/fetch/$s_!Vvzp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba827d6f-b37d-4915-979f-4cf11b3bc8c2_620x1146.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em><strong>Figure 4: </strong>Percentages of people with different ideologies who would be willing to make a lot of or some changes to how they live and work to help reduce the effects of global climate change, in 17 different countries. Only statistically significant differences are shown. Reprinted from Pew (2021).</em></figcaption></figure></div><p>This is evidence that environmentalism is more likely to be left-leaning. The explanation for this might involve something intrinsic to environmentalism itself, or it might involve interactions between countries and shared media environments.
But environmentalism clearly can remain bipartisan, as it has in the UK, France, Spain, Japan, South Korea, Taiwan, and Singapore.</p><p>The United States is more partisan overall than most other countries, but it is not an outlier. Some countries have similar levels of overall partisanship,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> yet almost no partisanship in their support for environmentalism: France<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a> and South Korea.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> There is no correlation between overall partisanship and partisanship in environmentalism.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a></p><h2>Compared to Other Times</h2><p>Environmentalism was a bipartisan issue in the United States as recently as the 1980s.</p><p>The longest data series for U.S.
public opinion on environmentalism comes from the General Social Survey, which has been administered to thousands of Americans for most years between 1974 and 2012.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Yoda!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Yoda!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png 424w, https://substackcdn.com/image/fetch/$s_!Yoda!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png 848w, https://substackcdn.com/image/fetch/$s_!Yoda!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png 1272w, https://substackcdn.com/image/fetch/$s_!Yoda!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Yoda!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png" width="824" height="469" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:469,&quot;width&quot;:824,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:195802,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Yoda!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png 424w, https://substackcdn.com/image/fetch/$s_!Yoda!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png 848w, https://substackcdn.com/image/fetch/$s_!Yoda!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png 1272w, https://substackcdn.com/image/fetch/$s_!Yoda!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff88e8db6-51cb-4f9e-978f-1292d20cabdb_824x469.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em><strong>Figure 5: </strong>Percentages of Democrats and Republicans reporting that national spending on the environment is &#8220;Too Little,&#8221; 1974-2012. Reprinted from McCright et al. (2014).</em></figcaption></figure></div><p>During the mid-to-late 1970s, support for environmentalism was declining in both parties. Democrats were consistently about 10 percentage points (p.p.) more likely than Republicans to say that there was too little environmental spending.</p><p>During the 1980s, support for environmentalism surged. This increase was even larger among Republicans than among Democrats, with the partisan gap closing by the end of the decade.</p><p>In the 1990s and 2000s, Democrats&#8217; support for environmentalism remained roughly constant, while Republicans&#8217; support fell dramatically. A large partisan gap opened. 
The overall support for environmentalism declined, although this might be because support for overall government spending also fell in the early 1990s.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a></p><p>Gallup polling on similar questions only goes back to 1997.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a> It shows an initially modest partisan gap of 15 p.p. in 1997, which grew to an over 50 p.p. gap in 2021.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!L41X!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!L41X!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png 424w, https://substackcdn.com/image/fetch/$s_!L41X!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png 848w, https://substackcdn.com/image/fetch/$s_!L41X!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png 1272w, https://substackcdn.com/image/fetch/$s_!L41X!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!L41X!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png" width="666" height="344.1" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:372,&quot;width&quot;:720,&quot;resizeWidth&quot;:666,&quot;bytes&quot;:30218,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!L41X!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png 424w, https://substackcdn.com/image/fetch/$s_!L41X!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png 848w, https://substackcdn.com/image/fetch/$s_!L41X!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png 1272w, https://substackcdn.com/image/fetch/$s_!L41X!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48317a00-ef2a-42f0-bc0e-ea5e003dcd1c_720x372.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em><strong>Figure 6: </strong>Percentages of Republicans, Independents, and Democrats who believe that global warming will pose a serious threat to themselves or their way of life, 1997-2021. Reprinted from Gallup (2021).</em></figcaption></figure></div><p>This change is especially striking because the Republican Party did not change its positions on most issues between the 1980s and 2000s. Underlying principles like small government economics and social conservatism were common to the Republican Party of both decades.
The anti-environmentalism of the Republican Party began in the 1990s, clearly after the &#8216;Reagan Revolution.&#8217;</p><h2>Conclusion</h2><p>The development of a large partisan gap about environmentalism in the United States was not inevitable. The United States has a smaller partisan gap for most other issues, other countries have less partisanship on this issue (even if the country is very partisan overall), and environmentalism was a bipartisan issue as recently as the 1980s.</p><p>This suggests that the explanation for the partisanship does not lie in broad structural or ideological factors that are consistent across many countries and times. Instead, the explanation is likely to be contingent, centered on the choices of individual decision makers.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Frank Newport. <em>Update: Partisan Gaps Expand Most on Government Power, Climate. </em>Gallup.
(2023) <a href="https://news.gallup.com/poll/509129/update-partisan-gaps-expand-government-power-climate.aspx">https://news.gallup.com/poll/509129/update-partisan-gaps-expand-government-power-climate.aspx</a>.</p><p>See also:&nbsp;</p><p>Frank Newport &amp; Andrew Dugan. <em>Partisan Differences Growing on a Number of Issues. </em>Gallup. (2017) <a href="https://news.gallup.com/opinion/polling-matters/215210/partisan-differences-growing-number-issues.aspx">https://news.gallup.com/opinion/polling-matters/215210/partisan-differences-growing-number-issues.aspx</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p><em>Economy Remains the Public&#8217;s Top Policy Priority; COVID-19 Concerns Decline Again. </em>Pew Research. (2023) <a href="https://www.pewresearch.org/politics/2023/02/06/economy-remains-the-publics-top-policy-priority-covid-19-concerns-decline-again/">https://www.pewresearch.org/politics/2023/02/06/economy-remains-the-publics-top-policy-priority-covid-19-concerns-decline-again/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Deborah Lynn Guber. <em>A Cooling Climate for Change? Party Polarization and the Politics of Global Warming. </em>American Behavioral Scientist <strong>57.1</strong>. (2013) p. 93&#8211;115.
<a href="https://cssn.org/wp-content/uploads/2020/12/A-Cooling-Climate-for-Change-Party-Polarization-and-the-Politics-of-Global-Warming-Deborah-Guber.pdf">https://cssn.org/wp-content/uploads/2020/12/A-Cooling-Climate-for-Change-Party-Polarization-and-the-Politics-of-Global-Warming-Deborah-Guber.pdf</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>James Bell, Jacob Poushter, Moira Fagan &amp; Christine Huang. <em>In Response to Climate Change, Citizens in Advanced Economies Are Willing To Alter How They Live and Work. </em>Pew Research. (2021) <a href="https://www.pewresearch.org/global/2021/09/14/in-response-to-climate-change-citizens-in-advanced-economies-are-willing-to-alter-how-they-live-and-work/">https://www.pewresearch.org/global/2021/09/14/in-response-to-climate-change-citizens-in-advanced-economies-are-willing-to-alter-how-they-live-and-work/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>Laura Silver. <em>Most across 19 countries see strong partisan conflict in their society. </em>Pew Research. 
(2022) <a href="https://www.pewresearch.org/short-reads/2022/11/16/most-across-19-countries-see-strong-partisan-conflicts-in-their-society-especially-in-south-korea-and-the-u-s/">https://www.pewresearch.org/short-reads/2022/11/16/most-across-19-countries-see-strong-partisan-conflicts-in-their-society-especially-in-south-korea-and-the-u-s/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>Macron &amp; Le Pen seem to have fairly similar climate policies. Both want France's electricity to be mostly nuclear &#8211; Le Pen more so. Neither plans to raise fuel taxes &#8211; Macron reluctantly so. Le Pen talks more about hydrogen and about reshoring manufacturing from countries that emit more. Macron supports renewables in addition to nuclear power. The various leftists (socialists, greens, and communists have run separately in recent elections) seem to be interested in phasing out nuclear &amp; replacing it with renewables. None of the parties dismiss climate change as an issue, and all are committed to following international climate agreements.</p><p>Kate Aronoff. <em>Marine Le Pen&#8217;s Climate Policy Leans Ecofascist. </em>The New Republic. (2022) <a href="https://newrepublic.com/article/166097/marine-le-pens-climate-policy-whiff-ecofascism">https://newrepublic.com/article/166097/marine-le-pens-climate-policy-whiff-ecofascism</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>Heesu Lee. <em>Climate Is the New &#8216;Must-Have&#8217; in South Korean Election Gameplan. </em>Bloomberg.
(2024) <a href="https://www.bloomberg.com/news/articles/2024-04-04/climate-is-the-new-must-have-in-south-korean-election-gameplan">https://www.bloomberg.com/news/articles/2024-04-04/climate-is-the-new-must-have-in-south-korean-election-gameplan</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>There are 14 countries in both the Pew survey on environmentalism and the Pew survey on overall partisanship. There is no correlation between the fraction of people who say that there are strong or very strong conflicts between people who support different parties in their country vs. the left-right difference between people who say that they are willing to make a lot of or some changes to how they live and work to help reduce the effects of global climate change. <a href="https://docs.google.com/spreadsheets/d/1h14JsezOloAUqy78MBo_JPpwiAjUcwQ-z8V5wuihUk8/edit?usp=sharing">https://docs.google.com/spreadsheets/d/1h14JsezOloAUqy78MBo_JPpwiAjUcwQ-z8V5wuihUk8/edit?usp=sharing</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>Aaron M. McCright, Chenyang Xiao, &amp; Riley E. Dunlap. <em>Political polarization on support for government spending on environmental protection in the USA, 1974-2012. </em>Social Science Research <strong>48</strong>. (2014) p. 251-260. 
<a href="https://www.sciencedirect.com/science/article/abs/pii/S0049089X1400132X">https://www.sciencedirect.com/science/article/abs/pii/S0049089X1400132X</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p><em>Little Public Support for Reductions in Federal Spending. </em>Pew Research. (2019) <a href="https://www.pewresearch.org/politics/2019/04/11/little-public-support-for-reductions-in-federal-spending/">https://www.pewresearch.org/politics/2019/04/11/little-public-support-for-reductions-in-federal-spending/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p>Lydia Saad. <em>Global Warming Attitudes Frozen Since 2016. </em>Gallup. (2021) <a href="https://news.gallup.com/poll/343025/global-warming-attitudes-frozen-2016.aspx">https://news.gallup.com/poll/343025/global-warming-attitudes-frozen-2016.aspx</a>.</p><p>Note that there are several similar questions, all of which show a small or zero partisan gap when the data starts, a gap which grows dramatically over time.</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[Essay competition on the Automation of Wisdom and Philosophy — $25k in prizes]]></title><description><![CDATA[We&#8217;re pleased to announce an essay competition on the automation of wisdom and philosophy. Submissions are due by July 14th. 
The first prize is $10,000, and there is a total of $25,000 in prizes available.]]></description><link>https://blog.aiimpacts.org/p/essay-competition-on-the-automation</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/essay-competition-on-the-automation</guid><pubDate>Mon, 15 Apr 2024 23:25:07 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/bc3000ac-da50-4193-a249-5bd89c00c255_1024x1024.png" length="0" type="image/png"/><content:encoded><![CDATA[<p><em>By Owen Cotton-Barratt</em></p><p>With AI Impacts, we&#8217;re pleased to announce an essay competition on the automation of wisdom and philosophy. Submissions are due by July 14th. The first prize is $10,000, and there is a total of $25,000 in prizes available.</p><h2>Background</h2><p>AI is likely to automate more and more categories of thinking with time.</p><p>By default, the direction the world goes in will be a result of the choices people make, and these choices will be informed by the best thinking available to them. People systematically make better, wiser choices when they understand more about issues, and when they are advised by deep and wise thinking.</p><p>Advanced AI will reshape the world, and create many new situations with potentially high-stakes decisions for people to make. To what degree people will understand these situations well enough to make wise choices remains to be seen. To some extent this will depend on how much good human thinking is devoted to these questions; but at some point it will probably depend crucially on how advanced, reliable, and widespread the automation of high-quality thinking about novel situations is.</p><p>We believe<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> that this area could be a crucial target for differential technological development, but is at present poorly understood and receives little attention. 
This competition aims to encourage and to highlight good thinking on the topics of what would be needed for such automation, and how it might (or might not) arise in the world.</p><p>For more information about what we have in mind, see some of the suggested essay prompts or the FAQ below.</p><h2>Scope</h2><p>To enter, please submit a link to a piece of writing, not published before 2024. This could be published or unpublished; although if selected for a prize we will require publication (at least in pre-print form; optionally on the AI Impacts website) in order to pay out the prize.&nbsp;</p><p>There are no constraints on the format &#8212; we will accept essays, blog posts, papers<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a>, websites, or other written artefacts<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> of any length. However, we primarily have in mind essays of 500&#8211;5,000 words. AI assistance is welcome but its nature and extent should be disclosed. As part of your submission you will be asked to provide a summary of 100&#8211;200 words.&nbsp;</p><p>Your writing should aim to make progress on a question related to the automation of wisdom and philosophy. A non-exhaustive set of questions of interest, in four broad categories:</p><ul><li><p><strong>Automation of wisdom</strong></p><ul><li><p>What is the nature of the sort of good thinking we want to be able to automate? 
How can we distinguish the type of thinking it&#8217;s important to automate well and early from types of thinking where that&#8217;s less important?</p></li><li><p>What are the key features or components of this good thinking?</p><ul><li><p>How do we come to recognise new ones?</p></li></ul></li><li><p>What are traps in thinking that is smart but not wise?</p><ul><li><p>How can this be identified in automatable ways?</p></li></ul></li><li><p>How could we build metrics for any of these things?</p></li></ul></li><li><p><strong>Automation of philosophy</strong></p><ul><li><p>What types of philosophy are language models well-equipped to produce, and what do they struggle with?</p></li><li><p>What would it look like to develop a &#8220;science of philosophy&#8221;, testing models&#8217; abilities to think through new questions, with ground truth held back, and seeing empirically what is effective?</p></li><li><p>What have the trend lines for automating philosophy looked like, compared to other tasks performed by language models?</p></li><li><p>What types of training/finetuning/prompting/scaffolding help with the automation of wisdom/philosophy?</p><ul><li><p>How much do they help, especially compared to how much they help other types of reasoning?</p></li></ul></li></ul></li><li><p><strong>Thinking ahead</strong></p><ul><li><p>Considering the research agenda that will (presumably) eventually be needed to automate high quality wisdom/philosophy:</p><ul><li><p>Which parts of the agenda can we expect to automate in a timely fashion?&nbsp;</p></li><li><p>What is the core that we will need humans to address?</p></li><li><p>What do we expect the thorny sticking points to be?</p></li></ul></li><li><p>Why may or may not this problem be solved &#8220;by default&#8221;? 
(from a technical standpoint)</p></li><li><p>Can we tell concrete stories or vignettes in which the automation of wisdom/philosophy is/isn&#8217;t important, to triangulate our understanding of what matters?</p></li><li><p>What preparatory research could provide the best groundwork for humanity to automate high-quality wisdom/philosophy before it is necessary?</p></li><li><p>What projects today or in the near future would be valuable to undertake?</p></li></ul></li><li><p><strong>Ecosystems</strong></p><ul><li><p>If the world were devoting serious attention to this, what would that look like?</p><ul><li><p>What incentives on institutional actors could push work onto related but less important questions; vice-versa what could help ensure that work remained well-targeted?</p></li></ul></li><li><p>What are the natural institutional homes for this research in the short term?</p><ul><li><p>Academia? Nonprofits? Frontier AI labs? Elsewhere in industry?</p></li></ul></li><li><p>What might be needed (proofs, audits, track record?) to enable humans (decision-makers, voters) and human institutions to correctly trust wise advice from AI systems?</p><ul><li><p>How could we lay the groundwork for this?</p></li></ul></li><li><p>Ideas for catalysing/sustaining this field?</p></li><li><p>Why may or may not this problem be solved &#8220;by default&#8221;? (from a social standpoint)</p></li></ul></li></ul><p>If you&#8217;re not sure whether a topic would be within scope, feel free to check with us.</p><h2>Judging</h2><p>The judging process will be coordinated by Owen Cotton-Barratt. 
After shortlisting, entries will be assessed by a panel of judges: <a href="https://stuhlmueller.org/">Andreas Stuhlm&#252;ller</a>, <a href="https://sites.google.com/a/brown.edu/brad-saad/?pli=1">Brad Saad</a>, <a href="https://davidmanley.squarespace.com/">David Manley</a>, <a href="https://forum.effectivealtruism.org/users/chi">Linh Chi Nguyen</a>, and <a href="http://www.weidai.com/">Wei Dai</a>.</p><p>Judging criteria will be:</p><ul><li><p>Does the entry tackle an important facet of the automation of wisdom/philosophy?</p></li><li><p>Does the entry contain good analysis or valuable new ideas?</p></li><li><p>Is the writing clear, succinct, and epistemically appropriate?</p></li><li><p>Does the entry provide something that we are excited to see built upon or explored further?&nbsp;</p></li></ul><p>The prize pool is $25,000, and the prize schedule will be:</p><ul><li><p>$10,000 First Prize</p></li><li><p>$5,000 Second Prize</p></li><li><p>4x $2,000 Best-in-Category Prizes&nbsp;</p><ul><li><p>Judging for these will exclude the overall First and Second Prize winners from consideration</p><ul><li><p>So if e.g. the overall First Prize and Second Prize both went to entries in the &#8220;Ecosystems&#8221; category, then the third-best entry in that category would receive $2,000&nbsp;</p></li></ul></li></ul></li><li><p>4x $500 Runner Up Prize, for the best entries across any category that did not receive another prize</p><ul><li><p>For these prizes, the judges may give preference to impressive entries by people at early career stages</p><ul><li><p>Whereas judging for the main prizes will &#8212; insofar as this is feasible &#8212; be blind to the identities and personal characteristics of the authors</p></li></ul></li></ul></li></ul><p>We may contact entrants whose work impresses us about possible further opportunities (e.g. 
conferences or research positions) on these topics.</p><h2>Details</h2><p>Entries should be submitted via <a href="https://forms.gle/CtcX2pp1Jg1JJkZK9">this form</a>, which asks for:</p><ul><li><p>Your name and email address</p></li><li><p>A link to your entry</p></li><li><p>A 100&#8211;200 word summary</p></li><li><p>Which if any of our four categories your entry falls under</p></li><li><p>Statement of authorship credit (including AI credit)</p></li><li><p>A brief description of career stage (so that judges can at their discretion account for this in awarding Runner Up prizes)</p></li><li><p>Opportunity to opt out of future contact not directly related to this competition</p></li><li><p>Anything else we should know</p></li></ul><p>You are of course welcome to seek feedback on drafts before submission. Coauthored articles are also very welcome.</p><p>The deadline for submissions is midnight anywhere in the world on Sunday 14th July. We hope to complete shortlisting within two weeks of the submission deadline, and contact winners within four weeks of the submission deadline. Winners whose entries are not yet public will have two weeks after we contact them to provide a public version, or agree to us publishing it on the AI Impacts website. Payment will be made by ACH (for US-based winners) or wire transfer (for international winners).</p><p>We reserve the right to extend the submission deadline or increase the prize pool without notice. Judges have the right to split prizes in cases of ties, or to not award prizes in the unlikely event that no submissions are found to merit them.</p><p>If you want to ask questions about the competition, feel free to comment, or to email essaycompetition@aiimpacts.org</p><h2>FAQ on the automation of wisdom and philosophy</h2><h3>What&#8217;s the basic idea here?</h3><p>We're interested in the automation of thinking that can help actors to take wise actions (whatever that means) and avoid unwise actions. 
As an important subcategory, we're interested in the automation of philosophical thinking, and how to avoid practical errors grounded in philosophical mistakes.</p><h3>What do you want to know about such automation?</h3><p>We're not certain! We think it's a potentially important area which hasn't received that much attention. We'd like people to explore more of the ideas around this. If we understood more of the contours of when such automation might be helpful (or unhelpful!), that would seem good. If we understood more about what would be necessary for automation, that would seem good. If people developed a sense of things it would be good for someone to do in the world, that's potentially great.</p><p>We give a bunch of example questions we'd be interested in people addressing in the essay prompts part of the announcement, but because it seems like a broad area we've preferred to leave the competition fairly open, and wait to see which parts people can make meaningful contributions to.</p><h3>What do you mean by &#8220;wisdom&#8221; and &#8220;philosophy&#8221;?</h3><p>By &#8220;wisdom&#8221;, we mean something like &#8220;thinking/planning which is good at avoiding large-scale errors&#8221;. An archetype of something which is smart-but-not-wise might be a plan full of clever steps which are each individually well-chosen to chain to the previous step in the plan, but which collectively forget why they were doing this, and end up taking actions which are in conflict with the original goal. Wisdom is also what&#8217;s needed for noticing that an old ontology was baking in some problematic assumptions about what was going on.</p><p>By &#8220;philosophy&#8221;, we mean something like &#8220;the activity of trying to find answers by thinking things through, without the ability to observe answers&#8221;. 
This is close to the sense understood in the academic discipline of philosophy.</p><p>We&#8217;re not sure if automating these things is most naturally thought of as one topic, two topics, or more &#8230;</p><h3>What threats are you concerned about?</h3><p>Progress in these areas seems like it could potentially help avoid a number of different issues:</p><h4>Unwise human actions</h4><p>Humans sometimes take actions which are predictably unwise (from some perspectives), and which they later regret. Such actions could be really bad if they interact with high stakes situations. If people had access to trusted high wisdom automated advice, this could help them to reduce the rate of these errors.</p><p>This might be particularly important around issues coming with the development of AI, as people will be facing very novel situations and be less able to rely on experience.</p><h4>Human philosophical errors</h4><p>People sometimes make decisions that are influenced by their philosophical understanding of an issue. This could happen in the future, e.g. around understanding of AI consciousness/rights. Automation of good work, if achievable, could help people to have deeper understanding by the times they need to make key decisions.</p><h4>Unwise AI actions</h4><p>If people empower AI agents, ensuring that they are in some sense wise and not just smart could help to reduce rare damaging actions. In the extreme this could reduce risk of human extinction (imagine an AI system which wipes out humans in order to secure its own power, and later on reflection wishes it hadn't; a wiser system might have avoided taking that action in the first place).</p><h4>AI philosophical errors</h4><p>If AI systems become superintelligent and are meaningfully running the world, their stances on philosophical questions could matter. e.g. deciding to engage in acausal trade (if it doesn&#8217;t actually make sense), or deciding not to (if it does) could be a large and consequential error. 
Better understanding of the automation of philosophy could help either to lead to more philosophically-competent AI systems, or alternatively could help people to coordinate about which parts of thinking should not be delegated to AI systems.</p><h3>Is there a particular threat model you&#8217;re focused on?</h3><p>No. We could make some guesses (both about which of the above categories are most concerning, and more concretely what the most concerning threats within them are), but we feel like the whole area is under-explored, and wouldn&#8217;t be confident in our guesses. We&#8217;d love to see high-quality analysis of this.</p><p>The fact that the automation of wisdom/philosophy seems important to better understand for multiple different threats &#8212; and also seems like a plausibly useful intervention for improving our ability to handle unknown unknowns &#8212; feeds into our desire to see it prioritized more than at present.</p><h3>Automating wisdom, philosophy &#8212; isn&#8217;t this all just AI capabilities work?</h3><p>Maybe! Certainly this is a type of capability (and high performance probably requires significantly advanced general capabilities, relative to today).</p><p>However, it seems to us that for a given level of general smarts in a system, the capacity for wisdom or philosophy could keep up with that, or could fail to. We are concerned about worlds where the ability to automate wise actions is outstripped by the ability to automate smart ones. So it seems like it may (at least in part) be a problem of differential technological development. 
We would be interested in further analysis of this question.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>The precise opinions expressed in this post should not be taken as institutional views of AI Impacts, but as approximate views of the competition organizers. We offer them not because we're sure they're exactly right, but because we think they're pointing in a promising direction and it's more likely to provoke high quality interesting entries if we provide some concrete starting points.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>We recognise that the timeline may be on the tight side for thoroughly researched papers. We are very happy to consider papers (and note that most journals accept papers that have been available as pre-prints, e.g. see &nbsp;<a href="https://philarchive.org/journals.html">https://philarchive.org/journals.html</a> for philosophy journals), but for entrants who are targeting academic publication we also welcome people putting the heart of their argument into an essay for the competition and later expanding it into a paper.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Feel free to use unusual formats if you consider them best for exploring the ideas. e.g. 
we would be happy to receive a fictional business plan or technical roadmap for a hypothetical firm working on a challenge in these areas.</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[Historical Note: Was a Subway in New York City Inevitable?]]></title><description><![CDATA[The first serious attempt at building a subway in New York City occurred in 1866. The following decades saw a sequence of 15 failed attempts, and the first subway in New York City would not begin operations until 1904. When we consider how popular the subway was, these failures are remarkable. Many of the important actors were indifferent or opposed, and those who supported building a subway were comically bad at coordinating. Only widespread, mostly decentralized public support was able to pressure different political and economic elites into finally cooperating. The history of these efforts made me wonder whether the construction of a subway in New York City was inevitable.]]></description><link>https://blog.aiimpacts.org/p/historical-note-was-a-subway-in-new</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/historical-note-was-a-subway-in-new</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Sat, 30 Mar 2024 00:30:50 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/c5ef5531-820e-419c-b4d5-30ef28cf8353_1024x817.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2>Introduction</h2><p>The first serious attempt at building a subway in New York City occurred in 1866, following the end of the Civil War (1865) and the opening of the first subway in London (1863). The following decades saw a sequence of failed attempts, and the first subway in New York City would not begin operations until 1904.</p><p>When we consider how popular the subway was, even when it first opened, these failures are remarkable. 
Many of the important actors were indifferent or opposed, and those who supported building a subway were comically bad at coordinating. Only widespread, mostly decentralized public support was able to pressure different political and economic elites into finally cooperating.</p><p>The history of these efforts made me wonder whether the construction of a subway in New York City was inevitable. I am usually skeptical of technological inevitability, but this seems like a potentially good example. For decades, the New York City subway was a locally resisted technological temptation,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> and a potential example of technological overhang.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> The subway eventually succeeded despite a system that seemed structured to thwart it.</p><p>The main source I read about this history was:</p><blockquote><p>Wallace B. Katz. <em>The New York rapid transit decision of 1900: Economy, society and politics.</em> (1978),<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> from a series put together by the Historic American Engineering Record.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a></p></blockquote><p>All quotes or claims that are not cited come from this source. I have not investigated other sources in detail, but it seems to present a mainstream scholarly view of the history.</p><h2>Why Did New York City Need A Subway?</h2><p>By 1866, it seems inevitable that New York City would become one of the greatest world cities. It has an excellent harbor and a deep river heading inland. 
After the completion of the Erie Canal (1825), most of the Midwest&#8217;s international trade passed through New York. It has been the largest city in the US since the first census (1790),<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> and was among the 10 largest in the world by 1850.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a> Network effects would help it continue to grow. The New York Stock Exchange brought in large businesses, especially railroads. New York City was also a major destination for immigrants.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a></p><p>Rapid transit was even more important for New York than for other great world cities. Manhattan is an island, and so New York City has fewer directions it can grow in than London, Paris, or Beijing. Manhattan is only two miles wide, and the central business district was at the end, in downtown.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a> People would commute much of the length of the island or across the rivers on ferries.</p><p>New York City had both surface and elevated trains before the subway was built. The surface trains (streetcars) had an extensive network, with many operators who eventually coalesced into the Metropolitan Traction Company. They could not travel faster than other traffic on the streets, so even an extensive network with good transfers wasn&#8217;t effective for longer trips. There were also some elevated lines (els) run by the Manhattan Railway Company. These were faster than the streetcars, but still had limited speed unless their steel viaducts were replaced by large, expensive, and more stable stone viaducts. 
Subways could go faster because they were built on solid ground. In practice, the subway was built with four tracks to provide both local and express service, while the els were built with only two tracks.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a></p><p>In the absence of a subway, most of the new immigrants packed into extremely dense tenement houses in the Lower East Side of Manhattan. These slums were widely understood to be bad for the health, wealth, and moral progress of the people who lived in them. The hope was that the subway would allow poor immigrants to access the job opportunities of downtown Manhattan without having to live in the slums of the Lower East Side.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a></p><h2>Major Actors</h2><p>The major actors involved in building the New York subway included:</p><ul><li><p><strong>Tycoons.&nbsp;<br><br></strong>Most infrastructure in the US had been built by private corporations or conglomerates, including New York&#8217;s streetcars and els. Most people assumed that some tycoon would be the one to build and operate the subway.<br><br>Most tycoons were uninterested in building the subway, because the construction would cost more than other transit possibilities. They believed that intercity rail was more likely to achieve an acceptable rate of return: at least 6% annually.<br><br>New York City&#8217;s Chamber of Commerce consisted of and represented the interests of the tycoons.<br></p></li><li><p><strong>Reformers.</strong>&nbsp;<br><br>These people were mostly interested in improving the lives of the people living in the slums. 
The subway would provide a great public service, and would confirm New York&#8217;s preeminent place among the greatest cities in the world.<br><br>This group largely overlaps with the tycoons.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a> Magnanimously building great public works was popular among the business elites - especially if these public works could also make a decent rate of return. The tycoons who were interested in urban transit (which was a minority) had progressive tendencies.<br><br>The Chamber of Commerce also represented the reformers.<br></p></li><li><p><strong>Tammany Hall.</strong>&nbsp;<br><br>New York City&#8217;s local politics was dominated by Tammany Hall. Their base was largely poor immigrant communities, where they would exchange favors for votes. They were engaged in many instances of small-scale corruption and had earned a reputation of &#8220;lay[ing] the hand of spoliation upon the public funds.&#8221;&nbsp;<br><br>Occasionally, a coalition of all other political factions in the city could defeat the Tammany candidate for mayor, or a Tammany-backed mayor would act independently once in office. For the most part, dealing with the city government meant dealing with Tammany Hall.<br><br>Tammany Hall was mostly indifferent to whether a subway was built. 
All of the other actors believed that Tammany was incapable of running a railroad and, if they gained control over one, they would use it for petty patronage.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-12" href="#footnote-12" target="_self">12</a><br></p></li><li><p><strong>State Government.&nbsp;<br><br></strong>Since the building of a railroad requires the exercise of state power (eminent domain), private companies would have to get a charter from the state legislature and sometimes approval from the state supreme court.<br><br>Most people in New York at the time lived in rural areas, so the state government was indifferent to whether New York City had a subway. They did not want to increase the power of their political rivals in the city, especially Tammany Hall.<br></p></li><li><p><strong>Metropolitan (streetcar) and Manhattan (els) Companies.&nbsp;<br><br></strong>They largely opposed the construction of a subway, because it would mean competition with their services. They had extensive economic and political connections across the city and could make things very difficult for anyone else interested in building or operating transit in New York City.<br><br>It is interesting that neither of them made a serious effort to build a subway themselves.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-13" href="#footnote-13" target="_self">13</a> Controlling multiple forms of transit would have given them a more substantial monopoly.<br><br>There were both technical and financial reasons for their lack of interest. Financially, both companies had raised capital using &#8216;watered&#8217; stock. This left them paying larger dividends to their shareholders than they could reasonably afford. 
Technically, they had other things they wanted to focus on: consolidating lines after purchasing formerly independent railroads, improving transfers, possibly adding a third set of rails to increase capacity, and electrification. Starting a whole new system seemed like a lot of additional work.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-14" href="#footnote-14" target="_self">14</a><br></p></li><li><p><strong>Local Businesses.&nbsp;<br><br></strong>Local businesses, especially along Broadway, did not want the disruption that subway construction would entail. These are the NIMBYs.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-15" href="#footnote-15" target="_self">15</a> They were successful at making sure that the first subway would not run under the street through downtown which had the highest demand.</p></li></ul><h2>Basic Dynamics</h2><p>The efforts to get a subway built occurred multiple times, with sixteen different companies being awarded a charter between 1864 and 1902. This is a large enough sample size to get a sense of the underlying dynamics at play.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-16" href="#footnote-16" target="_self">16</a></p><p>The impetus for subway construction typically came from progressives in the Chamber of Commerce, especially Abram Hewitt. They would draft a piece of legislation for the state legislature to consider. The legislation would create a commission empowered to grant a charter to a private company to build and operate the railroad. The commission would solicit bids<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-17" href="#footnote-17" target="_self">17</a> from potential tycoons or companies. 
The auction would fail to attract serious candidates, and the process would begin again.</p><p>The tycoons of the day were not interested in building the subway using their own money, because they did not believe that it would yield a significant rate of return. The city agreed to supply most of the capital, and so the city would legally own the subway. The construction would be done by the private company, which would lease the railway and run the operations for an extended period of time.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-18" href="#footnote-18" target="_self">18</a> The company would be obligated to repay the city at a fixed rate of return, but could keep the profits beyond that. The offer of public capital was supposed to incentivize some prominent businessman, one who had the necessary technical skills and would keep the subway out of the hands of Tammany Hall, to build and run it.</p><p>The details of the offered charter were determined by negotiations between the Chamber of Commerce, the state government, and city hall. These negotiations would determine who would be on the commission that reviewed the bids, how much capital would be provided publicly vs privately, what rate of return the city expected on its investment, and how long the lease would last. The result of the negotiations would end up being something that no prominent businessman would accept.</p><p>There would be some bidders. Sometimes, they would be people with insufficient financial backing or experience building railroads. 
Sometimes, they would be people interested in construction but not operations.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-19" href="#footnote-19" target="_self">19</a> Sometimes, the charter would be awarded, and the bidder would reveal that they were a front for someone who wanted to ensure that the subway was not built.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-20" href="#footnote-20" target="_self">20</a></p><p>This system was not working, and it did not look like it would start working in the future either.</p><h2>What Changed?</h2><p>Mass public support eventually forced the various elite groups to cooperate.</p><p>In 1893, when the Chamber of Commerce proposed yet another subway bill to the state legislature, New York City&#8217;s labor unions submitted a rival bill,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-21" href="#footnote-21" target="_self">21</a> which proposed a popular referendum. Although both groups wanted to see a subway built, the Chamber of Commerce opposed the referendum as a gateway to anarchy. The compromise bill that passed mostly followed the Chamber of Commerce&#8217;s proposal, but did include a referendum.</p><p>The referendum passed with over 75% of the vote. The Chamber of Commerce-dominated commission reluctantly found itself with broad popular support.</p><p>The referendum did provide some additional constraints: the subway would have to run the entire length of Manhattan, and would charge 5&#162; to go anywhere in the city. 
Despite these constraints, broad popular support was extremely useful.</p><p>After the state supreme court added some additional constraints,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-22" href="#footnote-22" target="_self">22</a> a delegation of working men visited the commission and told them that &#8220;the working people were surprised to see the Commission &#8216;knocked out&#8217; in one round against five judges &#8230; the law cannot be bigger than the will of the People.&#8221;</p><p>In 1898, the commission was still looking for offers, and was seriously looking at a proposal from the Metropolitan Traction Company.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-23" href="#footnote-23" target="_self">23</a> The proposal was contrary to the letter of the law and especially to the spirit of the referendum: they asked for a perpetual franchise, no taxes until after the line had paid for itself, and 10&#162; express services. The public was not happy: there were mass meetings throughout the city for weeks and almost every civic organization publicly opposed it, denouncing the Metropolitan, the commission, and Tammany together. Governor Theodore Roosevelt stepped in to stop the deal.</p><p>Mass public will had become strong enough that elite groups became willing to work together to build a subway.</p><p>The 1900 auction was attended by two bidders and the contract was awarded to John McDonald with (initially secret) backing from August Belmont II. McDonald had experience building railways in Baltimore, and Belmont had previously been involved in the management of the Long Island Rail Road. Both were tycoons, although not quite of the status the commission had originally intended. Both also had political connections to Tammany Hall. 
It is unclear what exactly was involved in the decision (or deal<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-24" href="#footnote-24" target="_self">24</a>) that gave McDonald &amp; Belmont the contract, but the city had finally found people who were both willing to build the subway and acceptable to most of the city&#8217;s elites.</p><p>The Metropolitan hadn&#8217;t quite given up. They convinced two financial institutions to renege on bonds they had agreed to provide for McDonald. Belmont had to step in, providing much of the money himself and creating a company to sell (not watered) stock, in exchange for most of the profits.</p><p>Belmont&#8217;s company, the Interborough Rapid Transit Company, built the first subway line in New York City between 1900 and 1904, and soon afterwards extended it under the East River to Brooklyn.</p><p>By the time the line began construction, all of the technical details had been hashed out.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-25" href="#footnote-25" target="_self">25</a> The route had been extensively litigated. Everyone knew that the best railroad for urban transit was underground, powered by &#8220;electricity,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-26" href="#footnote-26" target="_self">26</a> and as near the surface as practicable.&#8221;</p><h2>Conclusions</h2><p>It seems like New York City tried to fail at building a subway system. There was some significant opposition, from the existing transit companies and (to a lesser extent) NIMBY business owners along Broadway. More important than outright hostility seems to have been the indifference and conflict between various elite groups in New York.</p><p>Political leaders, both at Tammany Hall and the state legislature, didn&#8217;t care much about the project. Most of the efforts were instigated by civic-minded tycoons in the Chamber of Commerce. 
They proved unsuccessful at getting the political leaders to offer a contract that would convince prominent businessmen that this enterprise was worth pursuing. The resulting mess lasted for almost 40 years.</p><p>Public support for the subway seems to have been both broad and deep. Katz does not describe how the public became so convinced, but I can guess at the outlines of a story. The problems of overcrowded slums and the difficulty of traveling across Manhattan would have been obvious. The public also had to be informed that better rapid transit was possible. Once they became aware, it is unsurprising that they supported it.</p><p>The public was eventually able to impose their will on the elite groups, at times offering more support than even the commission leading the efforts. They used a referendum, letters &amp; delegations, mass meetings, and public statements by unrelated civic organizations to make their voice heard. This provided the pressure to get various elite groups to cooperate, until Tammany-friendly tycoons agreed to build the subway.</p><p>The result feels inevitable because of how it was pushed from below, in spite of decisions made by many individual actors. Elite indifference and opposition from a few key actors delayed the subway by a generation, but were not able to completely resist the public will.</p><p><em><strong>Cover image:</strong> David Sagarin. Historic American Engineering Record. IRT East Side Line at 23rd Street. Library of Congress, Prints and Photographs Division (1978). 
<a href="https://www.nycsubway.org/perl/show?7860">https://www.nycsubway.org/perl/show?7860</a>.</em></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p><em>Resisted Technological Temptations Project. </em>AI Impacts Wiki. (Accessed Feb 7, 2023) <a href="https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/resisted_technological_temptations_project">https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/resisted_technological_temptations_project</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>The metric to look at would be something like the time it takes to travel from 59th Street to South Ferry. I expect that there is an archive of historical schedules for transit in New York City, including subway lines, elevated trains, streetcars, and stagecoaches. I have not done the investigation, but I expect that this archive would allow you to determine how this travel time changed in this era. This data might show growth as the els were built during the 1870s, then a plateau during the 1880s and 1890s, then rapid growth or a discontinuity when the subway opened in 1904.</p><p>Jeffrey Heninger. <em>Are There Examples of Overhang for Other Technologies? </em>AI Impacts Blog. 
(2023) <a href="https://blog.aiimpacts.org/p/are-there-examples-of-overhang-for">https://blog.aiimpacts.org/p/are-there-examples-of-overhang-for</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Wallace B. Katz. <em>The New York rapid transit decision of 1900: Economy, society and politics.</em> Survey Number HAER-122, Historic American Engineering Record, National Park Service. (1978) p. 2-144. <br><a href="https://www.nycsubway.org/wiki/The_New_York_Rapid_Transit_Decision_of_1900_(Katz)">https://www.nycsubway.org/wiki/The_New_York_Rapid_Transit_Decision_of_1900_(Katz)</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>&nbsp;<em>The Interborough Subway. </em>Historic American Engineering Record, National Park Service, Department of the Interior, Washington, DC. 20240. 
(Accessed from nycsubway.org on Feb 15, 2024) <a href="https://www.nycsubway.org/wiki/The_Interborough_Subway_(Historic_American_Engineering_Record)">https://www.nycsubway.org/wiki/The_Interborough_Subway_(Historic_American_Engineering_Record)</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>Philadelphia was larger in 1776, and Philadelphia&#8217;s city boundaries were smaller than New York&#8217;s, so New York would not be the largest metro area for a few more decades.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p><em>World&#8217;s Largest Cities, 1850. </em>The Geography of Transport Systems. (Accessed Feb 7, 2023) <a href="https://transportgeography.org/contents/chapter8/transportation-urban-form/world-largest-cities-1850/">https://transportgeography.org/contents/chapter8/transportation-urban-form/world-largest-cities-1850/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>Most large cities grow mostly because people within their country move from rural areas to urban areas. This is less the case in the US before 1900. Because of the Homestead Acts, many people would move to more rural areas in the West, while immigrants would move into American cities. 
There was some rural-to-urban migration in the US in the late nineteenth century, but it did not become the main driver of urbanization until the twentieth century.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>Midtown&#8217;s central business district developed in the early twentieth century.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>By the 1920s, the average speed of the streetcars was 8 mph, the average speed of the els was 14 mph, the average speed of local subway service was 15 mph, and the average speed of express subway service was 25 mph.</p><p>Clifton Hood. <em>The Impact of the IRT on New York City.</em> Survey Number HAER-122, Historic American Engineering Record, National Park Service. (1978) p. 145-206. <a href="https://www.nycsubway.org/wiki/The_Interborough_Subway_(Historic_American_Engineering_Record)">https://www.nycsubway.org/wiki/The_Interborough_Subway_(Historic_American_Engineering_Record)</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>This hope does not seem to have been fulfilled when the first subway was completed. The original goal was to have the subway built out past the region with high land prices, so there could be better places for poor immigrants to live. 
By the time it was built, even the northern end of Manhattan had high enough land prices to prevent people moving there from the slums.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p>The Progressives who dominated the reform movement in the early twentieth century were often opposed to tycoons, but this was still the late nineteenth century.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-12" href="#footnote-anchor-12" class="footnote-number" contenteditable="false" target="_self">12</a><div class="footnote-content"><p>The worries were both that the subway would be mismanaged, and that the revenue it generated would be used to feed the Tammany political machine. Some specific things to be concerned about included hiring based on political connections, payroll padding, not enforcing fares for favored groups, neglecting maintenance, and (once the subway became multi-lined) poor scheduling leading to conflicts between trains. It&#8217;s not clear whether these concerns would have been realized, but later generations of city officials in New York could be extremely incompetent.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-13" href="#footnote-anchor-13" class="footnote-number" contenteditable="false" target="_self">13</a><div class="footnote-content"><p>They would sometimes submit subway proposals, but the goal of these proposals was to prevent anyone else from building a subway.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-14" href="#footnote-anchor-14" class="footnote-number" contenteditable="false" target="_self">14</a><div class="footnote-content"><p>The technical problems probably could have been made manageable. 
The company that did build the subway bought the els in Manhattan to strengthen their monopoly while the subway was still under construction.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-15" href="#footnote-anchor-15" class="footnote-number" contenteditable="false" target="_self">15</a><div class="footnote-content"><p>&#8216;NIMBY&#8217; is an acronym for &#8216;Not In My BackYard.&#8217; They oppose development in their local area, not because they dislike development, but because they do not want their particular neighborhood to change character.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-16" href="#footnote-anchor-16" class="footnote-number" contenteditable="false" target="_self">16</a><div class="footnote-content"><p>I am describing the entire process for a typical attempt. When an attempt failed, it would go back to an earlier stage, but sometimes not all the way back to the beginning. There were fewer pieces of legislation and commissions than there were auctions and charters.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-17" href="#footnote-anchor-17" class="footnote-number" contenteditable="false" target="_self">17</a><div class="footnote-content"><p>The first rendition did not have bids. The plan was to ask Vanderbilt, who had recently built some intercity rail into New York City, to build a subway. He refused, and so the commission failed.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-18" href="#footnote-anchor-18" class="footnote-number" contenteditable="false" target="_self">18</a><div class="footnote-content"><p>The els in Manhattan had a 999 year lease. The subway ended up with a 50 year lease. 
In 1940, the city would end up purchasing both private subway companies that were then operating in New York City, even though their leases were not up yet.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-19" href="#footnote-anchor-19" class="footnote-number" contenteditable="false" target="_self">19</a><div class="footnote-content"><p>This is unacceptable because it might lead to a situation where the city government, and therefore Tammany Hall, ends up in charge of operating the subway.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-20" href="#footnote-anchor-20" class="footnote-number" contenteditable="false" target="_self">20</a><div class="footnote-content"><p>The city would not give them the money in this case, but it would still delay the subway for a few more years.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-21" href="#footnote-anchor-21" class="footnote-number" contenteditable="false" target="_self">21</a><div class="footnote-content"><p>The immediate reason that labor unions submitted a bill was a recent economic downturn, the Panic of 1893. Subway construction would provide jobs for their unemployed members. The labor unions had not submitted bills during previous economic downturns, like the much more severe Panic of 1873. 
This bill is itself evidence of increasing public support for a subway independent of the reformers in the Chamber of Commerce.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-22" href="#footnote-anchor-22" class="footnote-number" contenteditable="false" target="_self">22</a><div class="footnote-content"><p>The first subway would not be built under Broadway, and the total cost for the line extending the length of Manhattan had to be clearly less than $50 million.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-23" href="#footnote-anchor-23" class="footnote-number" contenteditable="false" target="_self">23</a><div class="footnote-content"><p>It&#8217;s not clear whether the Metropolitan Traction Company would have gone through with their offer, or whether they would return the city&#8217;s money and declare it impossible. This at least seems to have been a more genuine proposal than some of their earlier efforts.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-24" href="#footnote-anchor-24" class="footnote-number" contenteditable="false" target="_self">24</a><div class="footnote-content"><p>Later muckrakers would claim that this decision was predetermined by a backroom deal between the Chamber of Commerce and Tammany Hall, although everyone involved denied this.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-25" href="#footnote-anchor-25" class="footnote-number" contenteditable="false" target="_self">25</a><div class="footnote-content"><p>A description of the debates over the technical details of the subway can be found in Part 1 of another article in the series:</p><p>Charles Scott. <em>Design and Construction of the IRT: Civil Engineering. </em>Survey Number HAER-122, Historic American Engineering Record, National Park Service. (1978) p. 207-282. 
<a href="https://www.nycsubway.org/wiki/Design_and_Construction_of_the_IRT:_Civil_Engineering_(Scott)">https://www.nycsubway.org/wiki/Design_and_Construction_of_the_IRT:_Civil_Engineering_(Scott)</a>.&nbsp;</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-26" href="#footnote-anchor-26" class="footnote-number" contenteditable="false" target="_self">26</a><div class="footnote-content"><p>If the subway had been built much earlier, it likely would have been steam powered. Ventilation for coal-burning locomotives in tunnels was difficult, but it had been done on London&#8217;s subway.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Commonwealth Fusion Systems is the Same Scale as OpenAI]]></title><description><![CDATA[Artificial intelligence is not the only exciting emerging technology.]]></description><link>https://blog.aiimpacts.org/p/commonwealth-fusion-systems-is-the</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/commonwealth-fusion-systems-is-the</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Fri, 12 Jan 2024 21:43:13 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!S6zY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Artificial intelligence is not the only exciting emerging technology. Another one that I am personally familiar with is fusion.</p><p>It seems interesting to compare companies working on emerging technologies to get a feel for how surprising developments in these companies are. Some startups working on developing AI have grown rapidly, but how surprising is this growth?</p><p>The leading startup for AI is OpenAI. The leading startup for fusion is Commonwealth Fusion Systems (CFS). 
Here is some basic information for these two companies:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!S6zY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!S6zY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png 424w, https://substackcdn.com/image/fetch/$s_!S6zY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png 848w, https://substackcdn.com/image/fetch/$s_!S6zY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png 1272w, https://substackcdn.com/image/fetch/$s_!S6zY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!S6zY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png" width="1168" height="517" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:517,&quot;width&quot;:1168,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:79281,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!S6zY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png 424w, https://substackcdn.com/image/fetch/$s_!S6zY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png 848w, https://substackcdn.com/image/fetch/$s_!S6zY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png 1272w, https://substackcdn.com/image/fetch/$s_!S6zY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22a2fbc8-7112-4d3a-a4a2-95ece6e77dd5_1168x517.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>OpenAI was founded 2-3 years before Commonwealth Fusion Systems.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> Other than that, their trajectories have been surprisingly similar.</p><p>OpenAI currently has 770 employees.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> Commonwealth Fusion Systems currently has &#8220;&gt;600 employees and &gt;100 contractors.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a></p><p>OpenAI has received about 5 times as much funding as CFS. 
Most of this money came from a single investment by Microsoft in 2023.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> If we look at OpenAI as it was 2-3 years ago, to control for the different founding dates, it had at that point raised less money than Commonwealth Fusion Systems has now. It would not surprise me if CFS gets significantly more investment in the next several years as well.</p><p>The $10B investment in OpenAI occurred after the launch of ChatGPT, 7 years after the company was founded. Commonwealth Fusion Systems&#8217; first major project is SPARC, a tokamak experiment which should be completed in 2025, also 7 years after the company was founded. The goal of SPARC is to demonstrate Q&gt;1, or net energy gain from the plasma,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a> as quickly as possible, and then work up to demonstrating Q~10.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> Fusion projects are notorious for falling behind schedule,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a> but CFS has maintained its schedule so far and construction is well underway. 
It is not certain if SPARC will succeed in demonstrating Q&gt;1 in 2025 or 2026, but if they do, they will likely see a significant increase in hype - and funding.&nbsp;</p><p>OpenAI was definitely not making a profit as recently as 2022.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a> Their revenue has increased dramatically since then,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a> so they might be making a profit now. Commonwealth Fusion Systems is definitely not making a profit now.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Mz8t!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Mz8t!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png 424w, https://substackcdn.com/image/fetch/$s_!Mz8t!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png 848w, https://substackcdn.com/image/fetch/$s_!Mz8t!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png 1272w, https://substackcdn.com/image/fetch/$s_!Mz8t!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png 
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Mz8t!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png" width="1200" height="742" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:742,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:&quot;Chart&quot;,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="Chart" srcset="https://substackcdn.com/image/fetch/$s_!Mz8t!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png 424w, https://substackcdn.com/image/fetch/$s_!Mz8t!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png 848w, https://substackcdn.com/image/fetch/$s_!Mz8t!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png 1272w, https://substackcdn.com/image/fetch/$s_!Mz8t!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7713fdd-e2f8-414f-8438-50a976b6ea1b_1200x742.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><strong>Figure 1: </strong>The funding histories for Commonwealth Fusion Systems and OpenAI look similar, once you account for OpenAI being several years older than CFS. Data and sources can be found in <a href="https://docs.google.com/spreadsheets/d/1s2dTYBuUp_gQ4uQ9PKhMJjaJChmtTyfmhh4EU_iDjtw/edit?usp=sharing">this spreadsheet</a>.</figcaption></figure></div><p>I&#8217;m not entirely sure what conclusions should be drawn from this comparison. It seems easy to overestimate the relative importance of an industry that is dominant locally (either in your city or in your social network) and to be unaware of developments that are more distant from you.
OpenAI and CFS are obviously not the only actors in AI and fusion, respectively, and I have not done a systematic comparison between the two fields. The technical details of these two fields are very different and so place different demands on their workforce and capital. Nevertheless, this comparison provides some evidence that investors&#8217; estimates of the value of AI and fusion are not wildly different.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p><em>Introducing OpenAI.</em> (December 11, 2015) <a href="https://openai.com/blog/introducing-openai">https://openai.com/blog/introducing-openai</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p><em>MIT launches multimillion-dollar collaboration to develop fusion energy.
</em>(March 9, 2018) <a href="https://www.nature.com/articles/d41586-018-02966-3">https://www.nature.com/articles/d41586-018-02966-3</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Savov, Vance, &amp; Ludlow. <em>OpenAI Staff Near Total Mutiny With Threat to Join Microsoft. </em>Bloomberg. (November 20, 2023) <a href="https://news.bloomberglaw.com/us-law-week/openai-staff-threaten-to-go-to-microsoft-if-board-doesnt-quit">https://news.bloomberglaw.com/us-law-week/openai-staff-threaten-to-go-to-microsoft-if-board-doesnt-quit</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>Dan Brunner. <em>The high magnetic field path to fusion energy. </em>65th Annual Meeting of the APS Division of Plasma Physics <strong>NO05:1</strong>. (November 1, 2023)</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>Bass. <em>Microsoft Invests $10 Billion in ChatGPT Maker OpenAI. </em>Bloomberg. (January 23, 2023) <a href="https://www.bloomberg.com/news/articles/2023-01-23/microsoft-makes-multibillion-dollar-investment-in-openai">https://www.bloomberg.com/news/articles/2023-01-23/microsoft-makes-multibillion-dollar-investment-in-openai</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>Q is the ratio of the energy produced by fusion to the energy injected into the plasma.
To accommodate inefficiencies in the rest of a fusion power plant, you would typically want to operate at Q&gt;5.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>Rodriguez-Fernandez et al. <em>Overview of the SPARC physics basis towards the exploration of burning-plasma regimes in high-field, compact tokamaks. </em>Nuclear Fusion <strong>62.4</strong>. (March 1, 2022) <a href="https://iopscience.iop.org/article/10.1088/1741-4326/ac1654">https://iopscience.iop.org/article/10.1088/1741-4326/ac1654</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>Fusion projects are also often large international collaborations. Missing schedule and budget targets could be because of the institutional structure of many fusion experiments, rather than because of something inherent to fusion technology itself.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>Erin Woo &amp; Amir Efrati. <em>OpenAI&#8217;s Losses Doubled to $540 Million as It Developed ChatGPT. </em>The Information. (May 4, 2023) <a href="https://www.theinformation.com/articles/openais-losses-doubled-to-540-million-as-it-developed-chatgpt">https://www.theinformation.com/articles/openais-losses-doubled-to-540-million-as-it-developed-chatgpt</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>Amir Efrati. 
<em>OpenAI&#8217;s Revenue Crossed $1.3 Billion Annualized Rate, CEO Tells Staff. </em>The Information. (October 12, 2023) <a href="https://www.theinformation.com/articles/openais-revenue-crossed-1-3-billion-annualized-rate-ceo-tells-staff">https://www.theinformation.com/articles/openais-revenue-crossed-1-3-billion-annualized-rate-ceo-tells-staff</a>.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Survey of 2,778 AI authors: six parts in pictures]]></title><description><![CDATA[The 2023 Expert Survey on Progress in AI is out, this time with 2778 participants from six top AI venues (up from about 700 and two in the 2022 ESPAI), making it probably the biggest ever survey of AI researchers.]]></description><link>https://blog.aiimpacts.org/p/2023-ai-survey-of-2778-six-things</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/2023-ai-survey-of-2778-six-things</guid><dc:creator><![CDATA[Katja Grace]]></dc:creator><pubDate>Thu, 04 Jan 2024 08:19:27 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!2J90!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The 2023 Expert Survey on Progress in AI is <a href="https://aiimpacts.org/wp-content/uploads/2023/04/Thousands_of_AI_authors_on_the_future_of_AI.pdf">out</a>, this time with 2778 participants from six top AI venues (up from <a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2022_expert_survey_on_progress_in_ai#population">about 700</a> and two in the <a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2022_expert_survey_on_progress_in_ai">2022 ESPAI</a>), making it probably the biggest ever survey of AI researchers. 
</p><p>People answered in October, an eventful fourteen months after the 2022 survey, which had mostly identical questions for comparison.</p><p><a href="https://aiimpacts.org/wp-content/uploads/2023/04/Thousands_of_AI_authors_on_the_future_of_AI.pdf">Here</a> is the preprint. And here are six interesting bits in pictures (with figure numbers matching paper, for ease of learning more):</p><ol><li><p><strong>Expected time to human-level performance dropped 1-5 decades since the 2022 survey. </strong>As always, our questions about &#8216;high level machine intelligence&#8217; (HLMI) and &#8216;full automation of labor&#8217; (FAOL) got very different answers, and individuals disagreed a lot (shown as thin lines below), but the aggregate forecasts for both sets of questions dropped sharply. For context, between 2016 and 2022 surveys, the forecast for HLMI had only shifted about a year.</p></li></ol><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!2J90!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!2J90!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png 424w, https://substackcdn.com/image/fetch/$s_!2J90!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png 848w, 
https://substackcdn.com/image/fetch/$s_!2J90!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png 1272w, https://substackcdn.com/image/fetch/$s_!2J90!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!2J90!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png" width="728" height="559" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:1118,&quot;width&quot;:1456,&quot;resizeWidth&quot;:728,&quot;bytes&quot;:1108014,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!2J90!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png 424w, https://substackcdn.com/image/fetch/$s_!2J90!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png 848w, 
https://substackcdn.com/image/fetch/$s_!2J90!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png 1272w, https://substackcdn.com/image/fetch/$s_!2J90!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01958ae7-cb8c-4340-a2e7-5b54c0ab5e7c_2076x1594.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">(Fig 3)</figcaption></figure></div><div class="captioned-image-container"><figure><a class="image-link image2
is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!6Vd3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!6Vd3!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png 424w, https://substackcdn.com/image/fetch/$s_!6Vd3!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png 848w, https://substackcdn.com/image/fetch/$s_!6Vd3!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png 1272w, https://substackcdn.com/image/fetch/$s_!6Vd3!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!6Vd3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png" width="728" height="559" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:1118,&quot;width&quot;:1456,&quot;resizeWidth&quot;:728,&quot;bytes&quot;:1011486,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!6Vd3!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png 424w, https://substackcdn.com/image/fetch/$s_!6Vd3!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png 848w, https://substackcdn.com/image/fetch/$s_!6Vd3!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png 1272w, https://substackcdn.com/image/fetch/$s_!6Vd3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31afa88c-324e-4b6d-91ed-97e2715e5b95_2076x1594.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">(Fig 4)</figcaption></figure></div><ol start="2"><li><p><strong>Time to most narrow milestones decreased, some by a lot.</strong> AI researchers are expected to be professionally fully automatable a quarter of a century earlier than in 2022, and NYT bestselling fiction dropped by more than half to ~2030. Within five years, AI systems are forecast to be feasible that can fully make a payment processing site from scratch, or entirely generate a new song that sounds like it&#8217;s by e.g. 
Taylor Swift, or autonomously download and fine-tune a large language model.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JM7j!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JM7j!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png 424w, https://substackcdn.com/image/fetch/$s_!JM7j!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png 848w, https://substackcdn.com/image/fetch/$s_!JM7j!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png 1272w, https://substackcdn.com/image/fetch/$s_!JM7j!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JM7j!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png" width="1456" height="2271" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:2271,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:457575,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!JM7j!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png 424w, https://substackcdn.com/image/fetch/$s_!JM7j!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png 848w, https://substackcdn.com/image/fetch/$s_!JM7j!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png 1272w, https://substackcdn.com/image/fetch/$s_!JM7j!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21f4fd7b-b6b7-4bf7-bb2d-f33901e2ffdc_2310x3603.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">(Fig 2)</figcaption></figure></div></li><li><p><strong>Median respondents put 5% or more on advanced AI leading to human extinction or similar, and a third to a half of participants gave 10% or more. 
</strong>This was across four questions, one about overall value of the future and three more directly about extinction.</p></li></ol><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ckLm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ckLm!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png 424w, https://substackcdn.com/image/fetch/$s_!ckLm!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png 848w, https://substackcdn.com/image/fetch/$s_!ckLm!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png 1272w, https://substackcdn.com/image/fetch/$s_!ckLm!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ckLm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png" width="1456" height="729" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:729,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:240318,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ckLm!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png 424w, https://substackcdn.com/image/fetch/$s_!ckLm!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png 848w, https://substackcdn.com/image/fetch/$s_!ckLm!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png 1272w, https://substackcdn.com/image/fetch/$s_!ckLm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59b60985-66a9-4364-976b-5c6228eff6fa_2544x1274.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">(Fig. 
14)</figcaption></figure></div><ol start="4"><li><p><strong>Many participants found many scenarios worthy of substantial concern over the next 30 years.</strong> For every one of eleven scenarios and &#8216;other&#8217; that we asked about, at least a third of participants considered it deserving of substantial or extreme concern.</p></li></ol><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!nGD9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!nGD9!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png 424w, https://substackcdn.com/image/fetch/$s_!nGD9!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png 848w, https://substackcdn.com/image/fetch/$s_!nGD9!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png 1272w, https://substackcdn.com/image/fetch/$s_!nGD9!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!nGD9!,w_2400,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png" width="1200" height="705.9360730593608" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;large&quot;,&quot;height&quot;:773,&quot;width&quot;:1314,&quot;resizeWidth&quot;:1200,&quot;bytes&quot;:64285,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-large" alt="" srcset="https://substackcdn.com/image/fetch/$s_!nGD9!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png 424w, https://substackcdn.com/image/fetch/$s_!nGD9!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png 848w, https://substackcdn.com/image/fetch/$s_!nGD9!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png 1272w, https://substackcdn.com/image/fetch/$s_!nGD9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82d27755-5476-4af5-ac32-a29fe0be5d41_1314x773.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">(Fig 10)</figcaption></figure></div><ol start="5"><li><p><strong>There are few confident optimists or pessimists about advanced AI: high hopes and dire concerns are usually found together.</strong> 68% of participants thought HLMI was more likely to lead to good outcomes than bad, but nearly half of these people put at least 5% on extremely bad outcomes such as human extinction, and 59% of net pessimists gave 5% or more to extremely good outcomes.</p></li></ol><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vrk3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp"
srcset="https://substackcdn.com/image/fetch/$s_!vrk3!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png 424w, https://substackcdn.com/image/fetch/$s_!vrk3!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png 848w, https://substackcdn.com/image/fetch/$s_!vrk3!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png 1272w, https://substackcdn.com/image/fetch/$s_!vrk3!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!vrk3!,w_2400,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png" width="1200" height="600.8241758241758" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;large&quot;,&quot;height&quot;:729,&quot;width&quot;:1456,&quot;resizeWidth&quot;:1200,&quot;bytes&quot;:327864,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-large" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!vrk3!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png 424w, https://substackcdn.com/image/fetch/$s_!vrk3!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png 848w, https://substackcdn.com/image/fetch/$s_!vrk3!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png 1272w, https://substackcdn.com/image/fetch/$s_!vrk3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce8dfb8d-83ec-453b-8dd2-3609565338b7_2544x1274.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">(Fig 11: 800 randomly selected responses shown as vertical bars; a higher-definition version is below)</figcaption></figure></div><div class="file-embed-wrapper" data-component-name="FileToDOM"><div class="file-embed-container-reader"><div class="file-embed-container-top"><image class="file-embed-thumbnail-default" src="https://substackcdn.com/image/fetch/$s_!0Cy0!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Fattachment_icon.svg"></image><div class="file-embed-details"><div class="file-embed-details-h1">Future Value 800</div><div class="file-embed-details-h2">58.1KB &#8729; PDF file</div></div><a class="file-embed-button wide" href="https://blog.aiimpacts.org/api/v1/file/29c9bfb8-5d57-4b5f-9ffe-e50691268b4d.pdf"><span class="file-embed-button-text">Download</span></a></div><a class="file-embed-button narrow" href="https://blog.aiimpacts.org/api/v1/file/29c9bfb8-5d57-4b5f-9ffe-e50691268b4d.pdf"><span class="file-embed-button-text">Download</span></a></div></div><ol start="6"><li><p><strong>70% of participants would like to see research aimed at minimizing risks of AI systems prioritized more highly.</strong> This is much the same as in 2022, and in both years a third of participants asked for &#8220;much more&#8221;&#8212;a share that has more than doubled since 2016.</p></li></ol><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9q_Q!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png"
data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9q_Q!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png 424w, https://substackcdn.com/image/fetch/$s_!9q_Q!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png 848w, https://substackcdn.com/image/fetch/$s_!9q_Q!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png 1272w, https://substackcdn.com/image/fetch/$s_!9q_Q!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9q_Q!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png" width="1456" height="882" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:882,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:615140,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!9q_Q!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png 424w, https://substackcdn.com/image/fetch/$s_!9q_Q!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png 848w, https://substackcdn.com/image/fetch/$s_!9q_Q!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png 1272w, https://substackcdn.com/image/fetch/$s_!9q_Q!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8e6a2c4-a549-4501-a242-9b3d2d3eddfa_2114x1280.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">(Fig 15)</figcaption></figure></div><p>If you enjoyed this, <a href="https://aiimpacts.org/wp-content/uploads/2023/04/Thousands_of_AI_authors_on_the_future_of_AI.pdf">the paper</a> covers many other questions, as well as more details on the above. What makes AI progress go? Has it sped up? Would it be better if it were slower or faster? What will AI systems be like in 2043? Will we be able to know the reasons for its choices before then? Do people from academia and industry have different views? Are concerns about AI due to misunderstandings of AI research? Do people who completed undergraduate study in Asia put higher chances on extinction from AI than those who studied in America? 
Is the &#8216;alignment problem&#8217; worth working on?</p>]]></content:encoded></item><item><title><![CDATA[When scientists consider whether their research will end the world]]></title><description><![CDATA[Five examples and what we can take away from them]]></description><link>https://blog.aiimpacts.org/p/when-scientists-consider-whether</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/when-scientists-consider-whether</guid><dc:creator><![CDATA[Harlan Stewart]]></dc:creator><pubDate>Tue, 19 Dec 2023 03:07:04 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Although there is currently no consensus about the future impact of AI, <a href="https://aiimpacts.org/how-bad-a-future-do-ml-researchers-expect/">many</a> experts are <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/views_of_ai_developers_on_risk_from_ai">warning</a> that the technology&#8217;s
continued progress could lead to catastrophic outcomes such as human <a href="https://www.safe.ai/statement-on-ai-risk">extinction</a>. Interestingly, some of the experts giving this warning work, or previously worked, at the labs actively developing frontier AI systems. This unusual situation raises questions that humanity has little experience with. If a scientific project might harm the entire world, who should decide whether the project will proceed? How much risk should decision-makers accept for outcomes such as human extinction?<br><br>While it is unusual for experts to consider whether their research will lead to catastrophic outcomes, it is also not completely unprecedented. Below, I will discuss the few relevant historical examples I could find, followed by some possible takeaways.</p><h1>Examples</h1><h2>1942-1945: Manhattan Project scientists considered whether a single atom bomb would set the entire planet on fire.</h2><p>During the Manhattan Project, <a href="https://www.insidescience.org/manhattan-project-legacy/atmosphere-on-fire">some of the scientists became concerned</a> that the atomic bomb they were building might generate enough heat to set off a chain reaction that would ignite the atmosphere and quickly end all life on Earth. Unlike many risks from new technologies, this one could not be directly evaluated through experimentation because there would be no survivors to observe a negative outcome of the experiment. Fortunately, the relevant physics was understood well enough to take a decent shot at evaluating the risk with theory and calculation alone.
And even more fortunately, the calculations showed that atmospheric ignition was very unlikely&#8212;or at least unlikely enough that the scientists proceeded to detonate the first atomic bomb for the Trinity test.</p><p>The historical accounts differ about exactly how small the scientists believed the risk to be and what amount of risk they deemed acceptable:</p><blockquote><p><em>If, after calculation, [Arthur Compton] said, it were proved that the chances were more than <strong>approximately three in one million</strong> that the earth would be vaporized by the atomic explosion, he would not proceed with the project. Calculation proved the figures slightly less -- and the project continued.</em><br>-Pearl S. Buck, in 1959, recalling a conversation she had with Arthur Compton, the leader of the Metallurgical Laboratory during the Manhattan Project.</p></blockquote><blockquote><p><em>There was never &#8220;a probability of slightly less than three parts in a million,&#8221;... Ignition is not a matter of probabilities; it is <strong>simply impossible</strong>.<br></em>-Hans Bethe, the leader of the T Division during the Manhattan Project, in 1976</p></blockquote><p>Arthur Compton never disputed Buck&#8217;s account,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> so Compton probably did decide that a 3x10<sup>-6</sup> (three-in-a-million) chance of extinction was acceptable. Although Hans Bethe&#8217;s contrasting account shows some disagreement about the likelihood of the risk, the only documented attempt to decide an acceptable limit for the risk is Compton&#8217;s.</p><p>It&#8217;s hard to say whether it was reasonable for Compton to accept a 3x10<sup>-6</sup> chance of atmospheric ignition.
It&#8217;s a small number, of the same order of magnitude as one&#8217;s <a href="https://en.wikipedia.org/wiki/Micromort#Leisure_and_sport">chance of dying from a single skydiving jump</a>. But when it comes to setting the earth on fire, how small is small enough? One way to weigh the risk is by finding the expected number of deaths it equates to, but the answer varies depending on the assumptions used:</p><ul><li><p>If the Manhattan Project scientists only gave moral value to humans who were alive in their time, then they could have multiplied the 1945 world population of <a href="https://www.atlasofhumanity.com/humanpopulationthroughtime">~2.3 billion</a> by Compton&#8217;s threshold probability of 3x10<sup>-6</sup> to get an expected <strong>7000 casualties</strong>.</p></li><li><p>As someone who lives &#8220;in the future&#8221; from the perspective of 1945, I would strongly prefer for them to have also placed moral value on humans living after their time. <a href="https://globalprioritiesinstitute.org/wp-content/uploads/Toby-Newberry_How-many-lives-does-the-future-hold.pdf">One analysis</a> estimates that there could be between 10<sup>13 </sup>and 10<sup>54</sup> future humans, so a 3x10<sup>-6</sup> chance of extinction would be equivalent to <strong>somewhere between 30 million and 3x10<sup>48</sup> casualties</strong>.</p></li></ul><p>A three-in-a-million chance of human extinction is clearly a serious risk, but it may have been reasonable given how gravely important Compton and his colleagues believed the Manhattan Project to be. In his interview with Buck, Compton made it clear that he considered the stakes to be extremely high, saying that it would have been &#8220;better to accept the slavery of the Nazis than to run the chance of drawing the final curtain on mankind.&#8221;</p><p>Because of the secrecy demanded by the Manhattan Project, the public had no knowledge of the risk of atmospheric ignition until many years later.
The assessment of the risk and the decision about how to handle it were likely made by a relatively small number of scientists and government officials.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-G4N!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-G4N!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg 424w, https://substackcdn.com/image/fetch/$s_!-G4N!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg 848w, https://substackcdn.com/image/fetch/$s_!-G4N!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!-G4N!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!-G4N!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg" width="512" height="408.2637362637363"
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1161,&quot;width&quot;:1456,&quot;resizeWidth&quot;:512,&quot;bytes&quot;:455593,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!-G4N!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg 424w, https://substackcdn.com/image/fetch/$s_!-G4N!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg 848w, https://substackcdn.com/image/fetch/$s_!-G4N!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!-G4N!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e64a55-0e18-4de8-b0f3-7a729b85a6a4_2048x1633.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em>&#8220;Hans Albrecht Bethe (1906-2005) being interviewed by journalists.&#8221; -</em><a href="https://www.flickr.com/photos/25053835@N03/4729461869">Smithsonian Institution</a></figcaption></figure></div><h2>1973-1975: Biologists considered whether recombinant DNA research would create deadly pathogens.</h2><p>Methods for genetic recombination quickly progressed in the early 1970s, making it easier to combine the DNA of multiple organisms and <a href="https://intelligence.org/files/TheAsilomarConference.pdf">causing some biologists to become concerned</a> that their research could create deadly new pathogens. In 1972, biochemist Paul Berg and his colleagues combined the DNA of a cancer-causing virus with the DNA of the <em>E. coli </em>bacterium, which lives in the human gut. Berg planned to insert this recombinant DNA back into <em>E. coli </em>bacteria but was convinced by his colleagues to halt the experiment over fears that the altered bacteria might get out of the lab and harm the world.</p><p>Prompted by these concerns, the National Academy of Sciences (NAS) formed a committee chaired by Berg to evaluate the risk. In 1974, the committee asked for a moratorium on certain types of recombinant DNA experiments. Despite disagreement among biologists, the moratorium was adhered to.</p><p>In 1975, scientists, lawyers, and policymakers met for the Asilomar Conference on Recombinant DNA Molecules to determine whether to lift the moratorium. Twelve journalists were also invited to the conference, <a href="https://www.youtube.com/watch?v=C-xKuICTMpY">possibly</a> due to fears about cover-up accusations in the wake of the recent Watergate scandal. The conference ultimately <a href="https://www.pnas.org/doi/pdf/10.1073/pnas.72.6.1981">concluded</a> that most recombinant DNA research should be allowed to proceed under strict guidelines but that experiments involving highly pathogenic organisms or toxic genes should be forbidden. The guidelines were adopted by the National Institutes of Health as a requirement for funding.</p><p>Unlike the physicists at the Manhattan Project, biologists in the 1970s did not have the theoretical understanding to confidently evaluate the risk and build a consensus around it. It&#8217;s unclear how likely or severe they thought the risks from recombinant DNA were. In 2015, Berg said that &#8220;if you sampled our true feelings, the members of the committee believed the experiments probably had <strong>little or no risk, but &#8230; nobody could say zero risk</strong>.&#8221; In the decades following the conference, recombinant DNA experiments were mostly safe, and many of the guidelines were gradually scaled back over time.
In 2007, Berg said that they &#8220;overestimated the risks, but [they] had no data as a basis for deciding, and it was sensible to choose the prudent approach.&#8221;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!b0pe!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!b0pe!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg 424w, https://substackcdn.com/image/fetch/$s_!b0pe!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg 848w, https://substackcdn.com/image/fetch/$s_!b0pe!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!b0pe!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!b0pe!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg" width="532" height="353.3269230769231" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:967,&quot;width&quot;:1456,&quot;resizeWidth&quot;:532,&quot;bytes&quot;:685082,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!b0pe!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg 424w, https://substackcdn.com/image/fetch/$s_!b0pe!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg 848w, https://substackcdn.com/image/fetch/$s_!b0pe!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!b0pe!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e7557fe-481f-4794-9d3a-5dd3b9ec2691_3472x2306.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em>&#8220;Dr. Maxine Singer, Dr. Norton Zinder, Dr. Sydney Brenner, and Dr. Paul Berg at the Asilomar Conference on Recombinant DNA</em>&#8221; -<em> </em><a href="https://collections.nlm.nih.gov/catalog/nlm:nlmuid-101441127-img#">National Library of Medicine</a>.</figcaption></figure></div><h2>1999: Physicists considered whether heavy-ion collider experiments would destroy the planet.</h2><p>In 1999, the year before the Relativistic Heavy Ion Collider (RHIC) began operating at Brookhaven National Laboratory (BNL), members of the public became concerned by media reports about a chance that the collider could create miniature black holes that would destroy the earth. 
In response to these concerns, the director of BNL convened a committee of physicists to write a report addressing the risks.</p><p>The <a href="https://arxiv.org/pdf/hep-ph/9910333.pdf">report, authored by Busza et al.,</a> uses a combination of theory and empirical evidence to demonstrate that RHIC was very unlikely to cause a catastrophe. The report evaluates three types of speculated catastrophic risks from RHIC but mostly focuses on the risk of dangerous &#8216;strangelets.&#8217;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> In the report, the authors say that theoretical arguments related to strangelet formation are enough on their own to &#8220;exclude any safety problem at RHIC confidently.&#8221; They also make an empirical argument based on the observation that the moon has not been destroyed despite constant bombardment by cosmic rays. Based on that empirical evidence, they derive an upper bound on the probability ranging from <strong>10<sup>&#8722;5</sup></strong> (one in a hundred thousand) to <strong>2&#215;10<sup>&#8722;11</sup></strong> (two in one hundred billion), depending on the assumptions made.</p><p>The authors of <a href="https://cds.cern.ch/record/405010/files/9910471.pdf">another report</a>, authored by Dar et al. and published by CERN, used a similar empirical argument about the frequency of supernovae to calculate an upper bound of <strong>2&#215;10<sup>-8</sup></strong> (two in a hundred million) for the likelihood of dangerous strangelets being produced at RHIC.</p><p>What did these physicists believe about the limit of acceptability for risks of this severity? Dar et al. describe their bound of 2&#215;10<sup>-8</sup> as &#8220;a safe and stringent upper bound.&#8221; In the <a href="https://arxiv.org/pdf/hep-ph/9910333v1.pdf">first version</a> of their paper, Busza et al. 
described their bounds as &#8220;a comfortable margin of error,&#8221; but in the final version, they instead say that &#8220;We do not attempt to decide what is an acceptable upper limit on <em>p</em>, nor do we attempt a &#8216;risk analysis,&#8217; weighing the probability of an adverse event against the severity of its consequences.&#8221;</p><p>The following year, physicist Adrian Kent released a <a href="https://arxiv.org/pdf/hep-ph/0009204v6.pdf">paper</a> criticizing Dar et al. and Busza et al. for a lack of nuance in their assessment of the risks. Kent points out that, even without considering future lives, Dar et al.&#8217;s &#8220;safe and stringent upper bound&#8221; of 2&#215;10<sup>-8</sup> would imply <strong>120 casualties</strong> in expectation over the course of ten years. He says that &#8220;despite the benefits of RHIC, the experiment would not be allowed to proceed if it were certain (say, because of some radiation hazard) to cause precisely 120 deaths among the population at large.&#8221;</p><p>Kent uses a back-of-the-envelope calculation involving established risk tolerances for radiation hazards to propose that the acceptable upper bound for risk of extinction should be <strong>10<sup>-15</sup></strong> (one in a quadrillion) per year if not accounting for future lives and <strong>10<sup>-22</sup></strong> (one in ten billion trillion) per year if accounting for future lives.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> He also argues that acceptable risk bounds for a project should be decided well in advance and agreed on by experts who aren&#8217;t actively involved in the project.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!dEOh!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!dEOh!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg 424w, https://substackcdn.com/image/fetch/$s_!dEOh!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg 848w, https://substackcdn.com/image/fetch/$s_!dEOh!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!dEOh!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!dEOh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg" width="528" height="402.5274725274725" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1110,&quot;width&quot;:1456,&quot;resizeWidth&quot;:528,&quot;bytes&quot;:1237219,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!dEOh!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg 424w, https://substackcdn.com/image/fetch/$s_!dEOh!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg 848w, https://substackcdn.com/image/fetch/$s_!dEOh!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!dEOh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2bd1d656-9fdb-470c-ac20-ca5c7987317d_2047x1560.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em>&#8220;A view of the superconducting magnets at Brookhaven's Relativistic Heavy Ion Collider. As gold particles zip along the collider's 2.4 mile long tunnel at nearly the speed of light, 1,740 of these magnets guide and focus the particle beams.&#8221; - </em><a href="https://www.flickr.com/photos/brookhavenlab/3112766443/in/photostream/">Brookhaven National Laboratory</a></figcaption></figure></div><h2>2005-?: The SETI community considers whether messages sent to space will invite the attention of hostile extraterrestrials.</h2><p>The search for extraterrestrial intelligence (SETI) involves monitoring the cosmos for signals of extraterrestrial life. Around 2005, some members of the SETI community became enthusiastic about the idea of &#8220;Active SETI,&#8221; which refers to intentionally sending messages to the cosmos. 
This sparked an intense debate within the community: <a href="https://phys.org/news/2015-02-controversy-interstellar-messaging.html">some were concerned</a> that Active SETI could endanger humanity by attracting the attention of hostile extraterrestrials. A 2006 <a href="https://www.nature.com/articles/443606a">article in Nature</a> argued that these were &#8220;<strong>small risks</strong>&#8221; that &#8220;should nevertheless be taken seriously.&#8221; Supporters of Active SETI argued that extraterrestrial civilizations could likely detect radio signals from Earth regardless of whether messages with more powerful signals were sent.</p><p>Science fiction author David Brin was one of the concerned figures in the SETI community. In 2006, Brin <a href="https://lifeboat.com/ex/shouting.at.the.cosmos">wrote</a> that he and others in the community had called for a conference to discuss the risks of Active SETI but that their concerns had been largely ignored by the rest of the community. Frustrated by what he perceived as a failure of Active SETI supporters to consult with their colleagues,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> Brin considered making the issue more public by contacting journalists with the story. He wrote that he preferred the &#8220;collegiate approach,&#8221; though, because public attention on the issue could damage SETI&#8217;s reputation as a whole, and it&#8217;s unclear whether he ever brought the story to journalists.</p><p>A permanent committee of the International Academy of Astronautics (IAA) is the closest thing to a regulatory body within the SETI community. 
In 2007, this committee <a href="http://resources.iaaseti.org/position.pdf">drafted</a> new principles regarding sending messages to extraterrestrials, including that the decision of whether to do so &#8220;should be made by an appropriate international body, broadly representative of Humankind.&#8221; However, these principles were never adopted.</p><p>The debate over Active SETI was renewed in 2010, when Stephen Hawking made <a href="https://www.nbcnews.com/id/wbna36769422">headlines</a> by calling it a bad idea. At a two-day conference hosted by the Royal Society, members of the SETI community had a heated debate but did not reach a consensus. At its annual meeting, the IAA SETI committee updated its <a href="http://resources.iaaseti.org/protocols_rev2010.pdf">Declaration of Principles</a> for the first time in over 20 years, but the updated principles still made no mention of Active SETI.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a></p><p>In 2015, the community debated the issue again at <a href="https://www.bbc.com/news/science-environment-31442952">another conference</a> with no resolution. That same year, 28 scientists and business leaders, including Elon Musk, signed a <a href="https://setiathome.berkeley.edu/meti_statement_0.html">statement</a> calling for &#8220;a worldwide scientific, political and humanitarian discussion&#8221; before continuing Active SETI, with an emphasis on the uncertainties surrounding the existence, capabilities, and intentions of potential extraterrestrial intelligence.</p><p>The debate over whether the risks of Active SETI are acceptable appears to remain unresolved, and it&#8217;s unclear whether the two sides are still discussing it with each other. 
The most recent transmission listed on the <a href="https://en.wikipedia.org/wiki/Active_SETI">Active SETI Wikipedia page</a> was sent in 2017.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!UW_9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!UW_9!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg 424w, https://substackcdn.com/image/fetch/$s_!UW_9!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg 848w, https://substackcdn.com/image/fetch/$s_!UW_9!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!UW_9!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!UW_9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg" width="294" height="441" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1200,&quot;width&quot;:800,&quot;resizeWidth&quot;:294,&quot;bytes&quot;:113303,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!UW_9!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg 424w, https://substackcdn.com/image/fetch/$s_!UW_9!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg 848w, https://substackcdn.com/image/fetch/$s_!UW_9!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!UW_9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F306350cf-3513-400e-ab99-ed4c06c8cbc0_800x1200.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em>In 2008, the Yevpatoria RT-70 radio telescope (pictured above) sent &#8220;<a href="https://www.telegraph.co.uk/news/newstopics/howaboutthat/3166709/Messages-from-Earth-sent-to-distant-planet-by-Bebo.html">A Message From Earth</a>&#8221;</em> <em>to Gliese 581 c. The message will arrive at its destination in 2029.</em> (Photo by <a href="https://commons.wikimedia.org/wiki/File:70-%D0%BC_%D0%B0%D0%BD%D1%82%D0%B5%D0%BD%D0%BD%D0%B0_%D0%9F-2500_(%D0%A0%D0%A2-70).jpg">S. Korotkiy</a>)</figcaption></figure></div><h2>1951-present: Computer scientists consider whether a sufficiently powerful misaligned AI system will escape containment and end life on Earth.</h2><p>Concern about the impact of powerful AI systems dates back to the beginning of modern computer science:</p><blockquote><p><em>Let us now assume, for the sake of argument, that [intelligent] machines are a genuine possibility, and look at the consequences of constructing them... 
There would be no question of the machines dying, and they would be able to converse with each other to sharpen their wits. At some stage therefore we should have to expect the machines to take control, in the way that is mentioned in Samuel Butler's Erewhon.<br></em>-Computer science pioneer Alan Turing <a href="https://rauterberg.employee.id.tue.nl/lecturenotes/DDM110%20CAS/Turing/Turing-1951%20Intelligent%20Machinery-a%20Heretical%20Theory.pdf">in 1951</a>.</p></blockquote><p>Today, many experts are concerned about the <a href="https://www.vox.com/future-perfect/2018/12/21/18126576/ai-artificial-intelligence-machine-learning-safety-alignment">risk</a> that there may eventually be an AI system that is much more capable than humans but whose goals are not aligned with ours. While pursuing its goals, such a system might cause human extinction through its efforts to acquire resources or survive.</p><p>Unlike the scientists at the Manhattan Project or Brookhaven National Laboratory, AI researchers have no agreed-upon method for calculating the amount of extinction risk. Theoretical understanding of the nature of intelligence does not yet have the strong foundation seen in fields such as nuclear physics. 
Empirical data is limited because no human has ever interacted with an intelligence more capable than all humans.</p><p>Expert opinions vary widely about the amount of risk posed by powerful AI systems:</p><ul><li><p>Decision theorist <a href="https://intelligence.org/2022/06/10/agi-ruin/">Eliezer Yudkowsky</a> argues that, with our current techniques, a powerful AI would be &#8220;<strong>roughly certain</strong> to kill everybody.&#8221;</p></li><li><p>Former OpenAI researcher <a href="https://www.alignmentforum.org/posts/Hw26MrLuhGWH7kBLm/ai-alignment-is-distinct-from-its-near-term-applications">Paul Christiano</a> believes that the total risk of extinction from AI is <strong>10-20%.</strong></p></li><li><p>Two researchers who describe themselves as <a href="https://optimists.ai/2023/11/28/ai-is-easy-to-control/">AI optimists</a> argue that &#8220;a catastrophic AI takeover is roughly <strong>1% likely</strong>.&#8221;</p></li><li><p>In a <a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2022_expert_survey_on_progress_in_ai">recent survey of AI researchers</a>, the median researcher gave a <strong>5-10% chance</strong><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> of catastrophic outcomes such as human extinction.</p></li></ul><p>Although it is common to see estimates of the extinction risk from AI, there seem to be few attempts to set an acceptable upper limit for it. 
Theoretical computer scientist Scott Aaronson <a href="https://scottaaronson.blog/?p=7042">wrote</a> that his limit &#8220;might be as high as&#8221; <strong>2%</strong> if the upside would be that we &#8220;learn the answers to all of humanity&#8217;s greatest questions.&#8221; Even without taking future lives into account, a 2% extinction risk is equivalent to around <strong>160 million casualties</strong> in expectation, roughly four times the population of Canada.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a> It&#8217;s difficult to say whether the potential benefits of powerful AI systems would justify taking that relatively high risk.</p><p>Citing the scientific uncertainty about the future outcomes of AI development, an <a href="https://futureoflife.org/open-letter/pause-giant-ai-experiments/">open letter</a> from earlier this year called for a moratorium on frontier AI development. The letter was signed by some prominent experts, including Yoshua Bengio, a Turing Award winner, and Stuart Russell, co-author of the standard textbook &#8220;Artificial Intelligence: A Modern Approach.&#8221; As of this writing, no moratorium seems to have taken place; just two weeks ago, Google <a href="https://blog.google/technology/ai/google-gemini-ai/">announced</a> the release of its &#8220;largest and most capable AI model.&#8221;</p><p>Although discussions about extinction risks from AI have historically been rare outside of a relatively small research community, many policymakers, journalists, and members of the public have recently become more involved. 
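</p><p>The expected-casualty figures quoted in this post (Kent&#8217;s 120 deaths, the 160 million above) come from multiplying a probability by an exposed population. A minimal sketch of that arithmetic, using rounded world-population figures as assumptions:</p>

```python
# Expected casualties from an all-or-nothing extinction-level event:
#   E[deaths] = P(event) * exposed population
# Population figures below are rounded assumptions for illustration.

def expected_casualties(p_event: float, population: float) -> float:
    """Expected deaths if an event kills everyone with probability p_event."""
    return p_event * population

# Kent's reading of Dar et al.'s bound: 2e-8 against roughly 6 billion people (c. 2000)
kent = expected_casualties(2e-8, 6e9)

# Aaronson's 2% upper limit against roughly 8 billion people today
aaronson = expected_casualties(0.02, 8e9)

print(f"2e-8 bound: {kent:,.0f} expected deaths")        # about 120
print(f"2% limit:   {aaronson:,.0f} expected deaths")    # about 160 million
```

<p>The computation only restates the figures already quoted; the contested question is the probabilities themselves, not the multiplication.</p><p>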
Last month, the <a href="https://www.theguardian.com/technology/2023/nov/02/five-takeaways-uk-ai-safety-summit-bletchley-park-rishi-sunak">AI Safety Summit </a>in the UK brought experts and world leaders together to discuss risks from AI and how to mitigate them.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Pdcq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Pdcq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Pdcq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Pdcq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Pdcq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Pdcq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg" width="1024" height="683" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:683,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:122306,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Pdcq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Pdcq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Pdcq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Pdcq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc6f02cb9-6b0c-45f1-9f5a-908706625f90_1024x683.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em>&#8220;[Nov 1, 2023]. Bletchley, United Kingdom. Delegates attend the Risks from Loss of Control over Frontier AI roundtable on day one of the UK AI Summit at Bletchley Park. Picture by Kirsty O'Connor / No 10 Downing Street.</em>&#8221; -<a href="https://www.flickr.com/photos/ukgov/53302343301/in/photostream/">UK Government</a></figcaption></figure></div><h1>Takeaways</h1><h3>Scientists sometimes disagree about the acceptable upper limit for extinction risks but usually agree that it should be extremely small. 
</h3><p>For three cases, I found examples of scientists suggesting an acceptable limit for risk of extinction:</p><ul><li><p>A Manhattan Project scientist set his upper limit at 3&#215;10<sup>-6</sup> (three in a million).</p></li><li><p>Scientists at CERN considered a 2&#215;10<sup>-8</sup> (two in a hundred million) risk to be acceptable for RHIC, but another physicist argued that a more appropriate limit would be 10<sup>-15</sup> (one in a quadrillion) or 10<sup>-22</sup> (one in ten billion trillion).</p></li><li><p>One prominent computer scientist might accept as much as a 2&#215;10<sup>-2</sup> (two in a hundred) chance of extinction from powerful AI.</p></li></ul><p>Although these upper limits vary widely, they all fall below the 5&#215;10<sup>-2</sup> (five in a hundred) to 10<sup>-1</sup> (one in ten) chance the median expert gave for extinction from AI last year, usually by many orders of magnitude. Notably, this is true even of the upper limit given by a scientist who reasonably believed that stopping his research might lead to Nazis ruling the world.</p><h3><strong>When there is significant uncertainty about the risks, it is common to request a moratorium. </strong></h3><p>In each of the three cases where there was no clear consensus about the amount of risk, some portion of the scientific community called for a moratorium to discuss the issue and collect more evidence:</p><ul><li><p>In the case of recombinant DNA, the moratorium was adhered to by the entire community despite disagreements. A conference established safety guidelines for some research and a prohibition on the most dangerous types of research. Over time, these guidelines were relaxed as recombinant DNA was shown to be mostly safe.</p></li><li><p>In the case of Active SETI, concerned community members who called for a moratorium and conference were initially dismissed. 
Later, there were conferences to discuss the issue, but the community never agreed on a set of guidelines.</p></li><li><p>In the case of frontier AI systems, only in the last few months has a moratorium been earnestly requested. No moratorium has yet occurred, but there is a growing discussion among researchers, policymakers, and the public about the potential risks and policies to address them.</p></li></ul><h3>By default, scientists prefer keeping discussions of extinction risk within their community. </h3><p>All five cases contain some degree of interaction between the scientific community and policymakers or the public:</p><ul><li><p>During the Manhattan Project, there was no public knowledge of the risk of atmospheric ignition, but the situation involved interaction between scientists and government officials. One of the scientists <a href="http://large.stanford.edu/courses/2015/ph241/chung1/">complained</a> that the risk &#8220;somehow got into a document that went to Washington. So every once in a while after that, someone happened to notice it, and then back down the ladder came the question, and the thing never was laid to rest.&#8221;</p></li><li><p>At the Asilomar Conference on Recombinant DNA, 12 journalists were invited, but possibly only out of a fear that the scientists would otherwise be accused of a cover-up.</p></li><li><p>Risk assessments for heavy-ion collider experiments were published only a year before RHIC began to operate, and seemingly only because of unexpected media attention.</p></li><li><p>A concerned member of the SETI community considered bringing the issue of Active SETI to journalists but said that he preferred to discuss the issue within the community.</p></li><li><p>Discussion of risks from AI has recently grown among policymakers and the public. 
Previously, there was relatively little discussion of the risks outside the research community.</p></li></ul><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Arthur Compton <a href="https://oac.cdlib.org/view?docId=hb0580022s;NAAN=13030&amp;doc.view=frames&amp;chunk.id=div00003&amp;toc.depth=1&amp;toc.id=&amp;brand=oac4">died in 1962</a>, three years after the article was published in <em>The American Weekly</em>. He was delivering lectures in the month before his death.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>The other two risks addressed by the paper are gravitational singularities and vacuum instability. The authors use a theoretical argument to show that gravitational singularities are very unlikely but do not estimate a bound on the probability. For the vacuum instability scenario, they cite earlier work which argues that cosmic ray collisions have occurred many times in Earth&#8217;s past lightcone, and the fact that we exist is empirical evidence that such collisions are safe.
&#8220;On empirical grounds alone, the probability of a vacuum transition at RHIC is bounded by <strong>2&#215;10<sup>&#8722;36</sup></strong>.&#8221; However, I am personally skeptical of this argument because it seems to ignore the effect of <a href="https://nickbostrom.com/papers/anthropicshadow.pdf">anthropic shadow</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Kent&#8217;s estimate of future lives is highly conservative compared to some other estimates: his calculation assumes only that the human population will hold constant at 10 billion until the earth is consumed by the sun.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>Kent said that &#8220;future policy on catastrophe risks would be more rational, and more deserving of public trust, if acceptable risk bounds were generally agreed ahead of time and if serious research on whether those bounds could indeed be guaranteed was carried out well in advance of any hypothetically risky experiment, with the relevant debates involving experts with no stake in the experiments under consideration.&#8221;</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>&#8220;<em>...this would seem to be one more example of small groups blithely assuming that they know better. Better than the masses. Better than sovereign institutions. Better than all of their colleagues and peers.
So much better &#8212; with perfect and serene confidence &#8212; that they are willing to bet all of human posterity upon their correct set of assumptions.&#8221;</em> &#8212;David Brin</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>Although the declaration says nothing about actively sending messages, it does contain a principle about responding to messages: <em>&#8220;In the case of the confirmed detection of a signal, signatories to this declaration will not respond without first seeking guidance and consent of a broadly representative international body, such as the United Nations.&#8221;</em></p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>The median response is represented here as a range because it varied based on question framing.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>If we do take <a href="https://globalprioritiesinstitute.org/how-many-lives-does-the-future-hold-toby-newberry-future-of-humanity-institute-university-of-oxford/">future lives</a> into account, it might be equivalent to expected casualties ranging from 200 billion to 2x10<sup>52</sup> (more than the <a href="https://sciencenotes.org/how-many-atoms-are-in-the-world/">number of atoms</a> on earth).</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[Are There Examples of Overhang for Other Technologies?]]></title><description><![CDATA[No.]]></description><link>https://blog.aiimpacts.org/p/are-there-examples-of-overhang-for</link><guid
isPermaLink="false">https://blog.aiimpacts.org/p/are-there-examples-of-overhang-for</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Wed, 13 Dec 2023 21:44:08 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>TL;DR:</strong> No.</p><h2>What Do I Mean By &#8216;Overhang&#8217;?</h2><h3>Hardware Overhang for AI</h3><p>One major concern about pausing AI development, from a purely safety perspective, is the possibility of hardware overhang.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> Here is the concern as I understand it:</p><p>Suppose a policy were put in place tomorrow that banned all progress in AI capabilities anywhere in the world for the next five years.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> Afterwards, the ban would be completely lifted.&nbsp;</p><p>Hardware would continue to progress during this AI pause. Immediately after the pause ended, it would be possible to train new AI systems using significantly more compute than was previously possible, taking advantage of the improved hardware. There would be a period of extremely rapid growth, or perhaps a discontinuity,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> until the capabilities returned to their previous trend. 
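</p><p>This dynamic can be put in a toy model. Everything here is invented for illustration: the exponential trend, the growth rate, the dates, and the assumption of instant, full catch-up when the pause ends.</p>

```python
def trend(year, base_year=2020, growth=1.5):
    """No-pause capability trend: exponential growth (rate is invented)."""
    return growth ** (year - base_year)

def paused_capability(year, pause_start=2025, pause_end=2030):
    """Capability under a pause: frozen at the pause-start level, then
    jumping back to the no-pause trend (the full catch-up scenario)."""
    if pause_start <= year < pause_end:
        return trend(pause_start)
    return trend(year)

def overhang(year, pause_start=2025, pause_end=2030):
    """The 'overhang': gap between the no-pause trend and paused capability."""
    return trend(year) - paused_capability(year, pause_start, pause_end)
```

<p>In this toy model the overhang grows throughout the pause and collapses to zero the moment the pause is lifted, which is what produces the discontinuity rather than steady growth.</p><p>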
Figure 1 shows a sketch of what we might expect progress to look like.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!SBJ3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!SBJ3!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png 424w, https://substackcdn.com/image/fetch/$s_!SBJ3!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png 848w, https://substackcdn.com/image/fetch/$s_!SBJ3!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png 1272w, https://substackcdn.com/image/fetch/$s_!SBJ3!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!SBJ3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png" width="1200" height="742" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:742,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!SBJ3!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png 424w, https://substackcdn.com/image/fetch/$s_!SBJ3!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png 848w, https://substackcdn.com/image/fetch/$s_!SBJ3!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png 1272w, https://substackcdn.com/image/fetch/$s_!SBJ3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35e06dc3-9f9b-4a37-8ff5-528bb6e9e9bc_1200x742.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><blockquote><p><strong>Figure 1: </strong>What AI progress might look like if there were a temporary pause in capabilities progress. The &#8216;overhang&#8217; is the difference between what AI capabilities currently are as a result of the pause and what AI capabilities could be if the pause had never been enacted, or were completely lifted.</p></blockquote><p>It might be worse for safety to have a pause followed by extremely rapid growth in capabilities than to have steady growth in capabilities over the entire time frame. AI safety researchers would have less time to work with cutting edge models. 
After the pause, society would have less time to become accustomed to a given level of capabilities before new capabilities appeared, and society might continue to lag behind for some time afterwards.</p><p>If we knew that there would be catch-up growth after a pause, it might be better not to pause AI capabilities research now and instead hope that AI remains compute-constrained so progress is as smooth as possible.</p><p>We do not know if there would be extremely rapid growth after a pause. To better understand how likely hardware overhang would be, I tried to find examples of hardware-overhang-like-things for other technologies.</p><h3>Overhang for Other Technologies</h3><p>Many technologies have an extremely important input: GPUs/TPUs for AI, engines for vehicles, or steel for large structures. Progress in such a technology can come either from improvements in its own design or from progress in the input, which makes the technology easier to improve. For AI, this is the distinction between algorithmic progress and hardware progress.</p><p>I am being purposefully vague about what &#8216;progress&#8217; and &#8216;input&#8217; mean here. Progress could be in terms of average cost, quantity produced, or some metric specific to that technology. The input is often something very particular to that technology, although I would also consider the general industrial capacity of society as an input. The definition is flexible to include as many hardware-overhang-like-things as possible.</p><p>It is possible for there to be a pause in progress for the technology itself, perhaps due to regulation or war, without there being a pause in progress for the inputs.
The pause should be exogenous: it is a less interesting analogy for AI policy if further progress becomes more difficult for technical reasons particular to that technology.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a> It is possible for AI progress to pause because of technical details about how hard it is to improve capabilities, and then for a new paradigm to see rapid growth, but this is a different concern than overhang due to AI policy. Exogenous pauses are cases where we might expect overhang to develop.</p><h2>Examples of Overhang</h2><h3>Methods</h3><p>To find examples of overhang, I looked in the data for our Discontinuous Progress Investigation<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> and in the Performance Curve Database.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a> The Performance Curve Database contains 135 entries, most of which include both average cost and production quantity as a function of time. The Discontinuous Progress Investigation contains 21 case studies. The specific metrics differ between case studies, but each should be a proxy for how &#8216;good&#8217; a particular technology is and should have sufficient historical data to establish a trend. Most of these include only record performance on this metric. In total, I looked at several hundred graphs of some notion of progress for a technology.</p><p>I initially looked for data which had a similar shape to Figure 1: a clear trendline showing growth, followed by a pause, followed by rapid growth and a return to the previous trendline. If I found this pattern, I would then try to figure out the cause of the pause and the subsequent growth.
If the pause was caused by an external factor, and the subsequent growth was driven by having better inputs, then this would be an example of overhang.&nbsp;</p><p>I also thought of technologies which were banned, and for which the ban was later completely reversed, to check whether they follow the pattern predicted by overhang.</p><h3>The Closest Example I Found: Land Speed Records</h3><p>Figure 2 shows the fastest any vehicle traveled on land over a distance of either 1 km or 1 mi from 1900 to 2000, according to Wikipedia.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Svsk!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Svsk!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png 424w, https://substackcdn.com/image/fetch/$s_!Svsk!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png 848w, https://substackcdn.com/image/fetch/$s_!Svsk!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png 1272w, https://substackcdn.com/image/fetch/$s_!Svsk!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Svsk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png" width="1200" height="742" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:742,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Svsk!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png 424w, https://substackcdn.com/image/fetch/$s_!Svsk!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png 848w, https://substackcdn.com/image/fetch/$s_!Svsk!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png 1272w, https://substackcdn.com/image/fetch/$s_!Svsk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ee1c53b-6221-42a1-a8a4-f51e23b5f351_1200x742.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><blockquote><p><strong>Figure 2: </strong>Historic land speed records in mph over time. Speeds on the left are an average of the record set in mph over 1 km and over 1 mile. The red dot represents the first record in a cluster that was from a jet propelled vehicle.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> The trendline shown is fit to the data from 1920-1940 and extrapolated.</p></blockquote><p>This data shows several pauses. The most interesting one for this blog post is from 1939 to 1963.
The first pause showed similar growth before and after it, with no catch-up growth, and the last pause has too little data to establish a clear trendline afterwards.</p><p>In 1939, progress in the land speed record abruptly stopped. No new records were set until 1947, and significant progress on this metric did not resume until 1963. Then there was extremely rapid progress over the next two years. This looks like catch-up growth, although it did not quite reach the previous trendline. It also was not sustained, and the catch-up growth was followed by another pause.</p><p>The explanation for this pattern follows what we would expect for overhang. The important input is the engine: having a more powerful engine is extremely important for making a car go fast. The exogenous source of the pause was WWII, which caused industrial capacity to shift towards more strategically important technologies. During WWII, engine progress continued. In particular, jet engines were developed for aircraft. Jet engines were initially reserved for military aircraft, and were not widely available for private buyers until after 1958, when the Boeing 707 entered service. In 1963, the land speed record was broken by a car with a jet engine for the first time, which began a period of rapid progress.</p><p>One other consideration for the land speed record is that it is not strategically or economically important, especially after WWII, so relatively small amounts of resources have been devoted to it.
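</p><p>As a rough check on how far the catch-up fell short of the trendline, here is the pre-war trend extrapolated forward, using a few well-known records (speeds in mph, rounded; treat the values as approximate and the exponential fit as illustrative):</p>

```python
# Selected land speed records, mph (rounded): Segrave 1927, Cobb 1939,
# Cobb 1947, Breedlove 1965. Approximate values, for illustration only.
records = {1927: 204, 1939: 370, 1947: 394, 1965: 601}

# Exponential growth rate over the pre-war period, roughly 5% per year.
rate = (records[1939] / records[1927]) ** (1 / (1939 - 1927))

# What the pre-war trendline implies for 1965: over 1000 mph.
trend_1965 = records[1939] * rate ** (1965 - 1939)

# The jet-powered catch-up was fast, but fell well short of the trendline.
shortfall = trend_1965 - records[1965]
```

<p>Even after the rapid jet-era progress of 1963-65, the record remained several hundred miles per hour below the extrapolated pre-war trend, consistent with the incomplete catch-up visible in Figure 2.</p><p>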
The most expensive record-breaking land vehicles cost millions or tens of millions of dollars.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a> If this were a more important technology, someone might have built a car with a jet engine earlier, leading to more continuous growth.</p><p>This is the only example I found which had an exogenous pause, and then faster growth after the pause.</p><h3>Endogenous Pauses: Optical Cables and Particle Accelerators</h3><p>The data I found that looks most like Figure 1 involves optical cables, and is shown in Figure 3. There is a clear trend, followed by a pause, followed by rapid growth that catches up to the previous trendline. The trend continues for another decade beyond the edge of this graph, through generations 5 &amp; 6.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!bMDn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!bMDn!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp 424w, https://substackcdn.com/image/fetch/$s_!bMDn!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp 848w, https://substackcdn.com/image/fetch/$s_!bMDn!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp 1272w, 
https://substackcdn.com/image/fetch/$s_!bMDn!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!bMDn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp" width="400" height="251" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/539af9c9-51a9-4407-8405-449a5d746703_400x251.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:251,&quot;width&quot;:400,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:11742,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!bMDn!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp 424w, https://substackcdn.com/image/fetch/$s_!bMDn!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp 848w, https://substackcdn.com/image/fetch/$s_!bMDn!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp 1272w, 
https://substackcdn.com/image/fetch/$s_!bMDn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F539af9c9-51a9-4407-8405-449a5d746703_400x251.webp 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><blockquote><p><strong>Figure 3: </strong>Bandwidth-distance product in fiber optics alone,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a> from Agrawal, 2016.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10"
target="_self">10</a></p></blockquote><p>This figure comes from a book chapter (Agrawal, 2016) which also explains the history. The cause of the pause was endogenous: it was determined by technical details of the technology itself, rather than something external. The original trendline is actually a combination of S-curves, each of which corresponds to a significant change in the design of the cables, called a &#8216;generation&#8217; of the technology. The transition between most generations required a single significant change. The transition between generations 3 &amp; 4 required three simultaneous changes to work. This explains why that transition occurred later than the others and produced more rapid growth.</p><p>Particle accelerators have followed a similar pattern to optical cables, as shown in Figure 4. There are periods of slower growth, followed by periods of rapid growth. The slower growth was caused by diminishing returns to existing designs, and the faster growth was caused by new accelerator designs.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a> The result is again a combination of S-curves.
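</p><p>The combination-of-S-curves pattern is easy to reproduce: model each generation as a logistic curve with a higher ceiling, and take the record as the best generation at each time. A late-arriving generation then looks like a pause followed by rapid catch-up growth. (All numbers below are invented for illustration.)</p>

```python
import math

def logistic(t, ceiling, midpoint, rate=1.0):
    """One technology generation: an S-curve saturating at `ceiling`."""
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

def record(t, generations):
    """Best performance across all generations at time t."""
    return max(logistic(t, c, m) for c, m in generations)

# Four generations, each with ~10x the previous ceiling; the fourth
# arrives late, as when a transition needs several simultaneous changes.
gens = [(1, 0), (10, 5), (100, 10), (1000, 20)]

pause_growth = record(16, gens) / record(13, gens)    # little growth over 3 years
catchup_growth = record(22, gens) / record(19, gens)  # rapid growth over 3 years
```

<p>Nothing exogenous is needed to get this shape: the pause and the burst both come from the internal structure of the technology, which is why endogenous cases like these are a different concern than overhang due to policy.</p><p>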
This pattern can be entirely explained by factors internal to the technology, rather than anything exogenous.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!lR3s!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!lR3s!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png 424w, https://substackcdn.com/image/fetch/$s_!lR3s!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png 848w, https://substackcdn.com/image/fetch/$s_!lR3s!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png 1272w, https://substackcdn.com/image/fetch/$s_!lR3s!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!lR3s!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png" width="1024" height="768" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/be1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:768,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:62946,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!lR3s!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png 424w, https://substackcdn.com/image/fetch/$s_!lR3s!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png 848w, https://substackcdn.com/image/fetch/$s_!lR3s!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png 1272w, https://substackcdn.com/image/fetch/$s_!lR3s!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1f719a-fe2b-42f3-978a-630699d82c0f_1024x768.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><blockquote><p><strong>Figure 4:</strong> Record center of mass energy for a particle-particle collision in a particle accelerator from 1920 to 2010. 
Note the three distinct S-curves from about 1930-1945, 1945-1970, and 1970-2010.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-12" href="#footnote-12" target="_self">12</a></p></blockquote><p>While this is an interesting way that a technology can develop, it is not a precedent for hardware overhang developing as a result of AI policy.</p><h3>Prohibition of Alcohol</h3><p>One example of a technology which was banned, and for which the ban was then completely reversed, is the Prohibition of alcohol in the US from 1920 to 1933.</p><p>The real cost of producing alcohol has fallen over time, as expected for a manufactured good.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-13" href="#footnote-13" target="_self">13</a> During Prohibition, general industrial practices in the US improved substantially, and alcohol continued to be produced at scale in other countries, likely resulting in continued progress there. This seems like a situation where overhang might develop.</p><p>Unfortunately, I have not been able to find data on the cost of producing alcoholic beverages<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-14" href="#footnote-14" target="_self">14</a> before and after Prohibition. The way economic data are reported changed significantly during the early 1900s, so it is difficult to compare data before<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-15" href="#footnote-15" target="_self">15</a> to data after.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-16" href="#footnote-16" target="_self">16</a> There are data and proxies for the consumption of alcohol before, during, and after Prohibition, shown in Figure 5, which show lower consumption after Prohibition than before. 
This does not suggest a significant reduction in the cost of alcohol immediately after Prohibition ended.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!gEPm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gEPm!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png 424w, https://substackcdn.com/image/fetch/$s_!gEPm!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png 848w, https://substackcdn.com/image/fetch/$s_!gEPm!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png 1272w, https://substackcdn.com/image/fetch/$s_!gEPm!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gEPm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png" width="790" height="536" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ce786ff2-9c43-4652-b76e-788003c598a3_790x536.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:536,&quot;width&quot;:790,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!gEPm!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png 424w, https://substackcdn.com/image/fetch/$s_!gEPm!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png 848w, https://substackcdn.com/image/fetch/$s_!gEPm!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png 1272w, https://substackcdn.com/image/fetch/$s_!gEPm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce786ff2-9c43-4652-b76e-788003c598a3_790x536.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><blockquote><p><strong>Figure 5: </strong>Alcohol consumption before and after Prohibition (squares, thicker line) and several proxies for alcohol consumption during Prohibition. Plot from Miron &amp; Zwiebel, 1991.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-17" href="#footnote-17" target="_self">17</a></p></blockquote><h3>Other Examples?</h3><p>These are the best examples of overhang I have been able to find. None of them match the behavior that people seem to expect will happen with hardware overhang.</p><p>Most of the data series seem even less like overhang than these. Many do not have clear trendlines, which makes them hard to interpret. Many have pauses followed by similar or slower growth. 
When there is a discontinuity in technological progress (which is uncommon, but exists), it is more likely to be preceded by accelerating growth than by a pause.</p><p>If you are aware of any technology where you think overhang might have occurred, please let me know in the comments below.</p><h2>Why Are There No Good Examples Of Overhang?</h2><p><em>This section is more speculative than the previous section.&nbsp;</em></p><p>The lack of examples of overhang for other technologies makes me skeptical that this would happen for AI. Catch-up growth for a particular technology after a pause is at least uncommon.</p><p>Here are some possible explanations I have considered for the observed lack of overhang. Feel free to suggest more in the comments.</p><h3>There&#8217;s Nothing So Permanent as a Temporary Government Program</h3><p>One premise of overhang is that a ban on progress for a particular technology would be enacted for a certain amount of time, and then completely ended. This is not how regulations normally work.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-18" href="#footnote-18" target="_self">18</a></p><p>There are lots of things which are regulated, many of which require complicated standards or enforcement mechanisms.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-19" href="#footnote-19" target="_self">19</a> It is easier to enact a regulation than it is to repeal a regulation. This is known as the &#8216;regulatory ratchet.&#8217;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-20" href="#footnote-20" target="_self">20</a> Legislatures and regulatory agencies have a mandate to pass new laws or regulations in response to problems. While they also have the power to repeal regulation, there is typically less incentive for them to work on repeals. 
Sunset laws might make this more symmetric, but they are uncommon.</p><p>Even if a ban is initially temporary, it is likely to be extended. The political coalition which was successful at getting the temporary ban enacted will still exist as the ban nears its end date. It was powerful enough to win before, and now it has status quo bias on its side. The industry which opposes the ban likely has less money and influence as a result of the ban. We should expect most new regulation to be permanent, unless its consequences are so visibly bad<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-21" href="#footnote-21" target="_self">21</a> that it empowers a reaction much stronger than the initial movement. Even when a political party makes repealing a particular piece of legislation a central plank of its platform, and then wins major electoral victories, it is still hard to get the legislation repealed.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-22" href="#footnote-22" target="_self">22</a></p><p>When a ban is reversed, it is unlikely to be fully reversed. Temporary bans are sometimes enacted to give the relevant community time to figure out what good policies are. The ban ends when these policies are enacted, but the new policies often maintain some restrictions from the previous ban. The Asilomar Conference on Recombinant DNA followed this pattern. In 1974, the National Academy of Sciences (NAS) enacted a temporary ban<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-23" href="#footnote-23" target="_self">23</a> on all experiments involving recombinant DNA, in response to concerns about biohazardous experiments. The Asilomar Conference in 1975 prohibited certain kinds of experiments, and provided recommendations for other experiments. 
After Asilomar, the total ban was reversed, and replaced with these strict guidelines.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-24" href="#footnote-24" target="_self">24</a> Some of these guidelines have since been scaled back, but regulation on recombinant DNA remains stricter than it was before the temporary ban.</p><h3>Dispelling the Hype</h3><p>Emerging technologies depend on hype. They are often not currently profitable, so they rely on promises of future success to acquire the resources they need. Without the hype, startups would fail and established companies would shift their investments elsewhere.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-25" href="#footnote-25" target="_self">25</a></p><p>Tightening regulations or imposing bans are effective ways to dispel hype, even if they are nominally temporary: people would (reasonably) expect that they might not stay temporary. A ban is a strong signal to the private sector and foreign countries that this technology is not going to be the next big thing, and it reduces the expected value of pursuing it. The hype seems unlikely to fully return after a ban is lifted.</p><p>There are two main resources driven by hype: venture capital and talent.</p><p>Venture capital is the most obvious resource fueled by hype. Investors know that startups are promising more than they can currently deliver, but believe that there is a good enough chance that they will be able to deliver in the future that they are worth investing in. If something happens that makes investors think that startups in a field will systematically not be able to deliver, that available capital can quickly shift elsewhere.</p><p>Talent responds more slowly than capital. 
If there is a ban that is expected to last more than a few years, then students who are choosing their careers are less likely to want to go into that field, and people who currently work in the field are more likely to consider switching careers. While venture capital might quickly return after a ban is lifted, re-recruiting talent that has gone elsewhere would take longer.</p><h3>Reduced Demand for the Input</h3><p>Overhang assumes that progress on the input continues during the pause just as it would have if the pause had not occurred.</p><p>It is not obvious whether this would be the case for AI. AI is one of the major markets for GPUs and is the sole market for TPUs. We should expect counterfactually less progress in GPUs &amp; TPUs as a result of AI regulation.</p><p>It might be different if the paused technology accounted for only a small fraction of the total market for the input. Extremely fast cars account for effectively none of the market for jet engines.</p><p>For technologies which account for a larger fraction of the demand for their inputs, progress in those inputs is less likely to continue at the same rate if they are paused. They would have less catch-up growth immediately available afterwards.</p><h2>Conclusion</h2><p>I do not think that technologies typically develop overhang if they are subject to an exogenous pause. I know of no good examples of overhang from other technologies.</p><p>There are several plausible reasons why overhang is uncommon. Most regulations are not temporary, most &#8216;temporary&#8217; regulations or bans are extended, and when bans are ended, this is often accompanied by stricter regulation than there was before. Temporary bans can dispel the hype surrounding an emerging technology, causing venture capital and (more slowly) talent to leave the field. 
The assumption that progress in the input would continue at the same rate while one of its major customers is paused also seems suspect.</p><p>As an example of how people do not expect overhang elsewhere, consider nuclear power. Nuclear power was getting steadily cheaper in the 1960s. Then, increasing safety regulation applied to nuclear power caused the learning curve to reverse, and the cost of building new nuclear reactors increased dramatically.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-26" href="#footnote-26" target="_self">26</a> Nuclear power advocates argue that, if we reduced regulations on nuclear power to a more reasonable level, we could see prices drop by a factor of 10 and return to the construction costs of the 1960s. They do not extrapolate the trendline from the 1960s to today and predict that, if this regulation were repealed, nuclear power would suddenly become 10<sup>6</sup> times cheaper than it is today, which is what we would expect if the overhang model applied. 
Extrapolating all the way to today is probably unreasonable: there&#8217;s a good chance that nuclear power would not have followed the trendline for that long anyway.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-27" href="#footnote-27" target="_self">27</a> Even so, advocates&#8217; optimistic views fall well short of what they would predict if they expected even partial overhang.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!3lQ1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!3lQ1!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png 424w, https://substackcdn.com/image/fetch/$s_!3lQ1!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png 848w, https://substackcdn.com/image/fetch/$s_!3lQ1!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png 1272w, https://substackcdn.com/image/fetch/$s_!3lQ1!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!3lQ1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png" width="1276" height="742" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:742,&quot;width&quot;:1276,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!3lQ1!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png 424w, https://substackcdn.com/image/fetch/$s_!3lQ1!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png 848w, https://substackcdn.com/image/fetch/$s_!3lQ1!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png 1272w, https://substackcdn.com/image/fetch/$s_!3lQ1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0cb6c6bb-36c3-4b23-b326-5f1246e9c18c_1276x742.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><blockquote><p><strong>Figure 6: </strong>Trendline for nuclear power plant construction costs in the US. Data are from Lovering et al.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-28" href="#footnote-28" target="_self">28</a> and digitized using WebPlotDigitizer.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-29" href="#footnote-29" target="_self">29</a> The trendline is fit to data from 1963-1966 and extrapolated to the year 2000. 
Extrapolating to 2023 results in two orders of magnitude of further cost reduction.</p></blockquote><p>I think that hardware overhang is still worth considering as a plausible bad outcome, but we should not treat overhang as the default or most likely future if there is a temporary pause in AI development. It seems more likely that AI progress would resume at a similar or slower rate after the ban is lifted, if it is lifted at all.</p><p><em>Thanks to Rick Korzekwa, Harlan Stewart, Zach Stein-Perlman, Rocket Drew, McKenna Fitzgerald, Ryan Kidd, Curt Tigges, and Alex Gray for helpful discussions on this topic.</em></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Zach Stein-Perlman. <em>Cruxes for overhang. </em>AI Impacts Blog. 
(2023) <a href="https://blog.aiimpacts.org/p/cruxes-for-overhang">https://blog.aiimpacts.org/p/cruxes-for-overhang</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>This blog post does not consider how this would be implemented. The focus is on what the effects would be if it were.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>I am using &#8216;discontinuity&#8217; in the sense of our discontinuous progress investigation:</p><p><em>Discontinuous progress investigation. </em>AI Impacts Wiki. (Accessed December 5, 2023.) <a href="https://wiki.aiimpacts.org/ai_timelines/discontinuous_progress_investigation">https://wiki.aiimpacts.org/ai_timelines/discontinuous_progress_investigation</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>This is not that uncommon. There are examples of progress for a particular technology stopping:</p><p><em>Examples of Progress for a Particular Technology Stopping. </em>AI Impacts Wiki. (Accessed December 5, 2023.) <a href="https://wiki.aiimpacts.org/ai_timelines/examples_of_progress_for_a_particular_technology_stopping">https://wiki.aiimpacts.org/ai_timelines/examples_of_progress_for_a_particular_technology_stopping</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p><em>Discontinuous progress investigation. </em>AI Impacts Wiki. (Accessed December 5, 2023.) 
<a href="https://wiki.aiimpacts.org/ai_timelines/discontinuous_progress_investigation">https://wiki.aiimpacts.org/ai_timelines/discontinuous_progress_investigation</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p><em>Performance Curve Database. </em>Santa Fe Institute. (Accessed December 6, 2023.) <a href="https://pcdb.santafe.edu">https://pcdb.santafe.edu</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p><em>Historic trends in land speed records. </em>AI Impacts Wiki. (Accessed December 5, 2023.) <a href="https://wiki.aiimpacts.org/doku.php?id=takeoff_speed:continuity_of_progress:historic_trends_in_land_speed_records">https://wiki.aiimpacts.org/doku.php?id=takeoff_speed:continuity_of_progress:historic_trends_in_land_speed_records</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>Dominik Wilde. <em>Bloodhound Land Speed Record Project Enters Administration.</em> Motor1 (2018) <a href="https://www.motor1.com/news/270039/bloodhound-land-speed-administration/">https://www.motor1.com/news/270039/bloodhound-land-speed-administration/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p><em>Historic trends in telecommunications performance. </em>AI Impacts Wiki. (Accessed December 5, 2023.) 
<a href="https://wiki.aiimpacts.org/takeoff_speed/continuity_of_progress/historic_trends_in_telecommunications_performance?s[]=fiber&amp;s[]=optics#easy-footnote-bottom-12-1368">https://wiki.aiimpacts.org/takeoff_speed/continuity_of_progress/historic_trends_in_telecommunications_performance?s[]=fiber&amp;s[]=optics#easy-footnote-bottom-12-1368</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>Govind P. Agrawal. <em>Optics in Our Time. </em>Ch. 8: Optical Communication: Its History and Recent Progress. Figure 8.8. (2016) p. 177-199. <a href="https://link.springer.com/chapter/10.1007/978-3-319-31903-2_8">https://link.springer.com/chapter/10.1007/978-3-319-31903-2_8</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p><em>Short History of Particle Accelerators. </em>CERN. (2006) p. 45. <a href="https://cas.web.cern.ch/sites/default/files/lectures/zakopane-2006/tazzari-history.pdf">https://cas.web.cern.ch/sites/default/files/lectures/zakopane-2006/tazzari-history.pdf</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-12" href="#footnote-anchor-12" class="footnote-number" contenteditable="false" target="_self">12</a><div class="footnote-content"><p><em>Historic trends in particle accelerator performance. </em>AI Impacts Wiki. (Accessed December 5, 2023.) 
<a href="https://wiki.aiimpacts.org/takeoff_speed/continuity_of_progress/historic_trends_in_particle_accelerator_performance">https://wiki.aiimpacts.org/takeoff_speed/continuity_of_progress/historic_trends_in_particle_accelerator_performance</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-13" href="#footnote-anchor-13" class="footnote-number" contenteditable="false" target="_self">13</a><div class="footnote-content"><p>The cost of alcohol at home has increased by 44% from 2000-2023, compared to a 79% increase in costs overall.</p><p>Stefan Skyes. <em>Here&#8217;s how the price of your beer has changed over time. </em>CNBC. (2023) <a href="https://www.cnbc.com/2023/06/02/here-is-why-beer-prices-are-going-up-according-to-our-data.html">https://www.cnbc.com/2023/06/02/here-is-why-beer-prices-are-going-up-according-to-our-data.html</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-14" href="#footnote-anchor-14" class="footnote-number" contenteditable="false" target="_self">14</a><div class="footnote-content"><p>I have found several data series on ethyl alcohol covering this time frame, but this was not banned:</p><p>FRED. <em>Price of Ethyl Alcohol, Grain for New York. </em>St. Louis Fed. (Accessed December 5, 2023.) <a href="https://fred.stlouisfed.org/series/M04177US000NYM054NNBR">https://fred.stlouisfed.org/series/M04177US000NYM054NNBR</a>.</p><p>FRED. <em>Ethyl Alcohol Production for United States. </em>St. Louis Fed. (Accessed December 5, 2023.) <a href="https://fred.stlouisfed.org/series/M01226USM441NNBR">https://fred.stlouisfed.org/series/M01226USM441NNBR</a>.</p><p>FRED. <em>Ethyl Alcohol Stocks, at Warehouses for United States. </em>St. Louis Fed. (Accessed December 5, 2023.) 
<a href="https://fred.stlouisfed.org/series/M05043USM441NNBR">https://fred.stlouisfed.org/series/M05043USM441NNBR</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-15" href="#footnote-anchor-15" class="footnote-number" contenteditable="false" target="_self">15</a><div class="footnote-content"><p>Data before Prohibition tend to involve prices for particular goods, found by having government officials ask wholesale suppliers what they pay to purchase them. See, for example:&nbsp;</p><p>Carroll D. Wright. <em>Comparative wages, prices, and cost of living. </em>Sixteenth annual report of the Massachusetts Bureau of Statistics of Labor. (1885) p. 126-129. <a href="https://babel.hathitrust.org/cgi/pt?id=hvd.32044050806330&amp;seq=141">https://babel.hathitrust.org/cgi/pt?id=hvd.32044050806330&amp;seq=141</a>.</p><p>Carroll D. Wright. Bulletin of the Department of Labor <strong>27</strong>. (1900) p. 260. <a href="https://fraser.stlouisfed.org/files/docs/publications/bls/bls_v05_0027_1900.pdf?utm_source=direct_download">https://fraser.stlouisfed.org/files/docs/publications/bls/bls_v05_0027_1900.pdf?utm_source=direct_download</a>.</p><p><em>Wholesale Prices 1890 to 1913. </em>Bulletin of the United States Bureau of Labor Statistics <strong>149</strong>. (1914) p. 164. <a href="https://fraser.stlouisfed.org/title/wholesale-prices-160/wholesale-prices-1890-1913-497569">https://fraser.stlouisfed.org/title/wholesale-prices-160/wholesale-prices-1890-1913-497569</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-16" href="#footnote-anchor-16" class="footnote-number" contenteditable="false" target="_self">16</a><div class="footnote-content"><p>Data after Prohibition tend to involve total personal expenditures on alcohol or total alcohol produced by the country as a whole. See, for example:</p><p>FRED. 
<em>Personal consumption expenditures: Nondurable goods: Alcoholic beverages purchased for off-premises consumption. </em>St. Louis Fed. (Accessed December 6, 2023.) <a href="https://fred.stlouisfed.org/series/DAOPRC1A027NBEA">https://fred.stlouisfed.org/series/DAOPRC1A027NBEA</a>.</p><p>FRED. <em>Personal consumption expenditures: Nondurable goods: Alcoholic beverages purchased for off-premises consumption (chain-type price index). </em>St. Louis Fed. (Accessed December 6, 2023.) <a href="https://fred.stlouisfed.org/series/DAOPRG3A086NBEA">https://fred.stlouisfed.org/series/DAOPRG3A086NBEA</a>.</p><p>FRED. <em>Personal consumption expenditures: Nondurable goods: Alcoholic beverages purchased for off-premises consumption (chain-type quantity index). </em>St. Louis Fed. (Accessed December 6, 2023.) <a href="https://fred.stlouisfed.org/series/DAOPRA3A086NBEA">https://fred.stlouisfed.org/series/DAOPRA3A086NBEA</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-17" href="#footnote-anchor-17" class="footnote-number" contenteditable="false" target="_self">17</a><div class="footnote-content"><p>Jeffrey A. Miron and Jeffrey Zwiebel. <em>Alcohol Consumption During Prohibition. </em>NBER Working Papers Series <strong>3675</strong>. (1991) <a href="https://www.nber.org/papers/w3675">https://www.nber.org/papers/w3675</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-18" href="#footnote-anchor-18" class="footnote-number" contenteditable="false" target="_self">18</a><div class="footnote-content"><p>A recent preprint looked at the persistence of policies decided by small margins in state referendums. It found that the likelihood that a policy remains in effect initially declines from 100% to about 40% over a few decades, then plateaus. 
Even a century later, 40% of the closely decided referenda remained in force, in some cases despite becoming less popular with the general public.</p><p>Zach Freitas-Groff. <em>Persistence in Policy: Evidence from Close Votes. </em>(2023)&nbsp; <a href="https://zachfreitasgroff.b-cdn.net/FreitasGroff_Policy_Persistence.pdf">https://zachfreitasgroff.b-cdn.net/FreitasGroff_Policy_Persistence.pdf</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-19" href="#footnote-anchor-19" class="footnote-number" contenteditable="false" target="_self">19</a><div class="footnote-content"><p><em>Examples of Regulated Things. </em>AI Impacts Wiki. (Accessed December 11, 2023). <a href="https://wiki.aiimpacts.org/responses_to_ai/examples_of_regulated_things">https://wiki.aiimpacts.org/responses_to_ai/examples_of_regulated_things</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-20" href="#footnote-anchor-20" class="footnote-number" contenteditable="false" target="_self">20</a><div class="footnote-content"><p>Mark R. Lee. <em>The Regulatory Ratchet: Why Regulation Begets Regulation. </em>University of Cincinnati Law Review <strong>87.3</strong>. (2019) <a href="https://scholarship.law.uc.edu/cgi/viewcontent.cgi?article=1286&amp;context=uclr">https://scholarship.law.uc.edu/cgi/viewcontent.cgi?article=1286&amp;context=uclr</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-21" href="#footnote-anchor-21" class="footnote-number" contenteditable="false" target="_self">21</a><div class="footnote-content"><p>Note that regulation does not just have to be bad: it has to be very visibly bad. 
Very impactful legislation does not have to be visible if, for example, its effects are felt through slower-than-counterfactual economic growth.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-22" href="#footnote-anchor-22" class="footnote-number" contenteditable="false" target="_self">22</a><div class="footnote-content"><p>Examples include Democrats&#8217; opposition to the Bush tax cuts and Republicans&#8217; opposition to Obamacare. The Bush tax cuts were set to expire after 10 years, but in 2013 they were made permanent for individuals making less than $400,000 per year. Republicans ran opposing Obamacare for years, but once they were in full control of the government, they were unable to repeal it. The biggest change they were able to make was setting the penalty of the individual mandate to zero in 2019. Lower-profile legislation is even less likely to be repealed.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-23" href="#footnote-anchor-23" class="footnote-number" contenteditable="false" target="_self">23</a><div class="footnote-content"><p>The NAS is a nonprofit organization, which does not have the power to enforce its policies. It was influential enough that its ban was followed, and it often advises regulatory agencies&#8217; policy decisions.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-24" href="#footnote-anchor-24" class="footnote-number" contenteditable="false" target="_self">24</a><div class="footnote-content"><p>Paul Berg. <em>Asilomar 1975: DNA modification secured. </em>Nature <strong>455</strong>. (2008) p. 290-291. 
<a href="https://www.nature.com/articles/455290a">https://www.nature.com/articles/455290a</a>.</p><p>While the NAS does not enforce the guidelines itself, the NIH requires compliance with them as a condition of funding.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-25" href="#footnote-anchor-25" class="footnote-number" contenteditable="false" target="_self">25</a><div class="footnote-content"><p>My not particularly well informed model of the dot-com bubble and crash is that it was primarily driven by hype. The internet was the next big thing, so resources poured into it, more than could be productively used. Once people realized that this was a bubble, the hype dissipated and the market crashed. This is an example of hype being dispelled privately, while this blog post discusses government action that dispels hype.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-26" href="#footnote-anchor-26" class="footnote-number" contenteditable="false" target="_self">26</a><div class="footnote-content"><p>Jason Crawford. <em>Why Nuclear Power Has Been A Flop. </em>Roots of Progress. (2021) <a href="https://rootsofprogress.org/devanney-on-the-nuclear-flop">https://rootsofprogress.org/devanney-on-the-nuclear-flop</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-27" href="#footnote-anchor-27" class="footnote-number" contenteditable="false" target="_self">27</a><div class="footnote-content"><p><em>Examples of Progress for a Particular Technology Stopping. </em>AI Impacts Wiki. (Accessed December 5, 2023.) 
<a href="https://wiki.aiimpacts.org/ai_timelines/examples_of_progress_for_a_particular_technology_stopping">https://wiki.aiimpacts.org/ai_timelines/examples_of_progress_for_a_particular_technology_stopping</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-28" href="#footnote-anchor-28" class="footnote-number" contenteditable="false" target="_self">28</a><div class="footnote-content"><p>Lovering, Yip, &amp; Nordhaus. <em>Historical construction costs of global nuclear power reactors. </em>Figure 2. Energy Policy <strong>91</strong>. (2016) p. 371-382. <a href="https://www.sciencedirect.com/science/article/pii/S0301421516300106">https://www.sciencedirect.com/science/article/pii/S0301421516300106</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-29" href="#footnote-anchor-29" class="footnote-number" contenteditable="false" target="_self">29</a><div class="footnote-content"><p>Ankit Rohatgi. WebPlotDigitizer <strong>4.6</strong>. 
(Accessed December 4, 2023) <a href="https://apps.automeris.io/wpd/">https://apps.automeris.io/wpd/</a>.</p></div></div>]]></content:encoded></item><item><title><![CDATA[AI Impacts Quarterlyish Newsletter, Jul-Oct 2023]]></title><description><![CDATA[Every quarter, we have a newsletter with updates on what&#8217;s happening at AI Impacts, with an emphasis on what we&#8217;ve been working on.]]></description><link>https://blog.aiimpacts.org/p/ai-impacts-quarterlyish-newsletter</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/ai-impacts-quarterlyish-newsletter</guid><dc:creator><![CDATA[Harlan Stewart]]></dc:creator><pubDate>Wed, 15 Nov 2023 20:15:33 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/690dfccc-21eb-4bf9-afb0-0f1192561f63_1792x1024.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Every quarter, we have a newsletter with updates on what&#8217;s happening at AI Impacts, with an emphasis on what we&#8217;ve been working on. You can see past newsletters <a href="https://blog.aiimpacts.org/t/newsletter">here</a> and subscribe to receive more newsletters and other blogposts <a href="https://blog.aiimpacts.org/subscribe">here</a>.</em></p><p>Since our <a href="https://blog.aiimpacts.org/p/ai-impacts-quarterly-newsletter-apr">last newsletter</a>, we <a href="https://blog.aiimpacts.org/p/new-report-a-review-of-the-empirical">reviewed some empirical evidence for AI risk</a>, completed two <a href="http://aiimpacts.org/wp-content/uploads/2023/10/Eco-labels-case-study-website-version.pdf">case</a> <a href="https://aiimpacts.org/wp-content/uploads/2023/09/Standards-Case-Study-Institutional-Review-Boards.pdf">studies</a> about standards, and wrote a few other blog posts and wiki pages. 
Recently we have been busy with ongoing projects, including the <a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2023_expert_survey_on_progress_in_ai">2023 Expert Survey on Progress in AI</a>, which we will share the results of soon.</p><h1>Research and writing highlights</h1><h2>Case studies about standards</h2><ul><li><p>In response to <a href="https://forum.effectivealtruism.org/posts/idrBxfsHkYeTtpm2q/seeking-paid-case-studies-on-standards">Holden Karnofsky&#8217;s call for case studies</a> on standards that are interestingly analogous to AI safety standards, Harlan and Jeffrey completed case studies for two examples of social-welfare-based standards:&nbsp;</p><ul><li><p>Jeffrey wrote <a href="https://aiimpacts.org/wp-content/uploads/2023/09/Standards-Case-Study-Institutional-Review-Boards.pdf">a report about institutional review boards</a> for medical research. IRBs were formed after some unethical medical experiments were publicized. The government does not regulate medical research itself, but instead requires institutions that do medical research to regulate it in compliance with federal rules. The principles of medical ethics developed in the US spread widely across the world with little effort from the US government.</p></li><li><p>Harlan wrote <a href="https://aiimpacts.org/wp-content/uploads/2023/10/Eco-labels-case-study-website-version.pdf">a report about Green Seal and SCS,</a> the first eco-labeling programs in the US. These programs failed to transform the consumer market as they initially intended. 
However, by establishing themselves as experts at the right time, they may have influenced the purchasing behavior of institutions, as well as the creation of other eco-labeling programs.</p></li></ul></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!umq3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!umq3!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg 424w, https://substackcdn.com/image/fetch/$s_!umq3!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg 848w, https://substackcdn.com/image/fetch/$s_!umq3!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!umq3!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!umq3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg" width="507" height="314" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:314,&quot;width&quot;:507,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:18871,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!umq3!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg 424w, https://substackcdn.com/image/fetch/$s_!umq3!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg 848w, https://substackcdn.com/image/fetch/$s_!umq3!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!umq3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20958afb-3299-4e88-baab-972b74f9380a_507x314.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><em>The adoption of standards created by Green Seal seems to be closely tied to the procurement policies of the US Government. Joe Biden&#8217;s recent <a href="https://www.whitehouse.gov/briefing-room/statements-releases/2023/10/30/fact-sheet-president-biden-issues-executive-order-on-safe-secure-and-trustworthy-artificial-intelligence/">executive order</a> calls for, among other things, &#8220;guidance for agencies&#8217; use of AI, including clear standards to&#8230; improve AI procurement..&#8221;</em></figcaption></figure></div><h2>Empirical evidence for misalignment and power-seeking in AI</h2><ul><li><p>Rose Hadshar investigated the evidence for future AI being misaligned, and for it being power-seeking, focusing on empirical evidence. 
Her project had several outputs:</p><ul><li><p>A <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/is_ai_an_existential_threat_to_humanity/database_of_empirical_evidence_about_ai_risk">database</a> of empirical examples of evidence for different AI behaviors, such as power-seeking, goal misspecification, and deception</p></li><li><p>A <a href="https://arxiv.org/abs/2310.18244">report</a> reviewing some evidence for existential risk from AI, focused on empirical evidence for misalignment and power-seeking</p></li><li><p>A <a href="https://blog.aiimpacts.org/p/a-mapping-of-claims-about-ai-risk">blogpost</a> outlining some of the key claims that are often made in support of the argument that AI poses an existential threat</p></li><li><p>A series of <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/is_ai_an_existential_threat_to_humanity/interviews_on_the_strength_of_the_evidence_for_ai_risk_claims">interviews</a> of AI researchers about their views on the strength of the available evidence for risks from AI.</p></li></ul></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kgC1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kgC1!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png 424w, https://substackcdn.com/image/fetch/$s_!kgC1!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!kgC1!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!kgC1!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kgC1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png" width="1456" height="840" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:840,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:939245,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kgC1!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png 424w, https://substackcdn.com/image/fetch/$s_!kgC1!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!kgC1!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!kgC1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa31e3dc7-f30d-459f-8793-d33b88e1fd34_1774x1024.png 1456w" sizes="100vw"></picture></div></a><figcaption class="image-caption"><em>A new <a 
href="https://wiki.aiimpacts.org/arguments_for_ai_risk/is_ai_an_existential_threat_to_humanity/database_of_empirical_evidence_about_ai_risk">database</a> includes empirical examples of evidence for different AI behaviors, such as power-seeking, goal misspecification, and deception.</em></figcaption></figure></div><h2>Feasibility of strong AI regulation</h2><ul><li><p>To better understand how proposals to regulate AI hardware might work, Rick and Jeffrey compiled a <a href="https://wiki.aiimpacts.org/responses_to_ai/examples_of_regulated_things">list of existing product regulations</a> that limit the amount of a product people can own or require licensure or registration. Products on the list include fireworks, chickens, and uranium.</p></li><li><p>Jeffrey wrote a <a href="https://blog.aiimpacts.org/p/muddling-along-is-more-likely-than">blogpost</a> explaining why he believes that strict regulation of AI development is plausible without disrupting progress in other areas of society.</p></li></ul><h2>Miscellany</h2><ul><li><p>Zach wrote <a href="https://blog.aiimpacts.org/p/cruxes-for-overhang">Cruxes for overhang</a>, a blogpost identifying some of the crucial considerations for the possibility that slowing AI progress now could lead to faster progress later.</p></li><li><p>Jeffrey compiled a <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/list_of_possible_risks_from_ai">list of possible risks from AI</a>.</p></li><li><p>Zach wrote a blogpost summarizing some takeaways about <a href="https://blog.aiimpacts.org/p/us-public-opinion-on-ai-september">US public opinion on AI</a>, based on collected survey data.</p></li><li><p>Jeffrey compiled a <a href="https://wiki.aiimpacts.org/ai_timelines/examples_of_progress_for_a_particular_technology_stopping">list of examples of progress for a particular technology stopping</a>, showing that a particular technology can have periods of both progress and stagnation.</p></li></ul><h1>Ongoing projects</h1><ul><li><p>Katja, 
Rick, and Harlan worked with outside collaborators on the <a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2023_expert_survey_on_progress_in_ai">2023 Expert Survey on Progress in AI</a>. We are looking forward to sharing the details and results soon.</p></li><li><p>Zach is working on a project on best practices for frontier AI labs to develop and deploy AI safely.</p></li></ul><h1>Recent references to AI Impacts research</h1><ul><li><p>Data from the <a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2016_expert_survey_on_progress_in_ai">2016</a> and <a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2022_expert_survey_on_progress_in_ai">2022</a> Expert Surveys on Progress in AI was referenced in the UK government&#8217;s discussion paper <a href="https://www.gov.uk/government/publications/frontier-ai-capabilities-and-risks-discussion-paper">Frontier AI: capabilities and risks</a>, an <a href="https://aitreaty.org/">open letter</a> advocating for an AI treaty, an article from the Telegraph titled <a href="https://www.telegraph.co.uk/business/2023/09/23/artificial-intelligence-safety-summit-sunak-ai-experts/">&#8216;This is his climate change&#8217;: The experts helping Rishi Sunak seal his legacy</a>, and a paper published by the Centre for the Governance of AI titled <a href="https://www.governance.ai/research-paper/risk-assessment-at-agi-companies-a-review-of-popular-risk-assessment-techniques-from-other-safety-critical-industries">Risk Assessment at AGI Companies: A Review of Popular Risk Assessment Techniques From Other Safety-Critical Industries</a></p></li><li><p>Katja was quoted in a recent <a href="https://sd11.senate.ca.gov/news/20230913-senator-wiener-introduces-safety-framework-artificial-intelligence-legislation">press release</a> from California State Senator Scott Wiener, as well 
as an article in Vox titled <a href="https://www.vox.com/future-perfect/2023/7/7/23787011/ai-arms-race-tragedy-commons-risk-safety">AI is a &#8220;tragedy of the commons.&#8221; We&#8217;ve got solutions for that</a></p></li></ul><h1>Funding</h1><p>Thank you to our recent funders, including Jaan Tallinn, who gave us a <a href="https://survivalandflourishing.fund/sff-2023-h2-recommendations">$179k grant</a> through the Survival and Flourishing Fund; the Future of Life Institute, which gave us a <a href="https://survivalandflourishing.fund/sff-2023-h2-recommendations">$162k grant</a> through the Survival and Flourishing Fund; and Open Philanthropy, which gave us a <a href="https://www.openphilanthropy.org/grants/ai-impacts-expert-survey-on-progress-in-ai/">$150k grant</a> in support of the <a href="https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2023_expert_survey_on_progress_in_ai">2023 Expert Survey on Progress in AI</a>.<br><br>We are still seeking funding for 2024. If you want to talk to us about why we should be funded or hear more details about our plans, please write to Rick or Katja at [firstname]@aiimpacts.org. If you'd like to donate to AI Impacts, you can do so <a href="https://aiimpacts.org/donate/">here</a>. 
(And we thank you!)<br><br><em>Image credit: DALL-E 3</em><br></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://blog.aiimpacts.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://blog.aiimpacts.org/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[New report: A review of the empirical evidence for existential risk from AI via misaligned power-seeking]]></title><description><![CDATA[Visiting researcher Rose Hadshar recently published a review of some evidence for existential risk from AI, focused on empirical evidence for misalignment and power seeking. (Previously from this project: a blogpost outlining some of the key claims that are often made about AI risk]]></description><link>https://blog.aiimpacts.org/p/new-report-a-review-of-the-empirical</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/new-report-a-review-of-the-empirical</guid><dc:creator><![CDATA[Harlan Stewart]]></dc:creator><pubDate>Mon, 06 Nov 2023 23:23:46 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/00d1ee92-5d1c-439e-8526-b82ddda58958_349x308.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Visiting researcher Rose Hadshar recently published <a href="https://arxiv.org/pdf/2310.18244.pdf">a review of some evidence for existential risk from AI, focused on empirical evidence for misalignment and power seeking</a>. 
(Previously from this project: a blogpost outlining some of the <a href="https://blog.aiimpacts.org/p/a-mapping-of-claims-about-ai-risk">key claims that are often made about AI risk</a>, a series of <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/is_ai_an_existential_threat_to_humanity/interviews_on_the_strength_of_the_evidence_for_ai_risk_claims">interviews</a> of AI researchers, and a <a href="https://wiki.aiimpacts.org/arguments_for_ai_risk/is_ai_an_existential_threat_to_humanity/database_of_empirical_evidence_about_ai_risk">database</a> of empirical evidence for misalignment and power seeking.)</p><p>In this report, Rose looks into evidence for:</p><ul><li><p>Misalignment,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> where AI systems develop goals which are misaligned with human goals; and&nbsp;</p></li><li><p>Power-seeking,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> where misaligned AI systems seek power to achieve their goals.</p></li></ul><p>Rose found the current state of this evidence for existential risk from misaligned power-seeking to be concerning but inconclusive:</p><ul><li><p>There is empirical evidence of AI systems developing misaligned goals (via specification gaming<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> and via goal misgeneralization<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a>), including in deployment (via specification gaming), but it's not clear to Rose whether these problems will scale far enough to pose an existential risk.</p></li><li><p>Rose considers the conceptual arguments for power-seeking behavior from AI systems to be strong, but notes that she could not find any clear 
examples of power-seeking AI so far.</p></li></ul><p>With these considerations, Rose thinks that it&#8217;s hard to be very confident either that misaligned power-seeking poses a large existential risk, or that it poses no existential risk. She finds this uncertainty to be concerning, given the severity of the potential risks in question. Rose also expressed that it would be good to have more reviews of evidence, including evidence for other claims about AI risks<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> and evidence against AI risks.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>&#8220;An AI is misaligned whenever it chooses behaviors based on a reward function that is different from the true welfare of relevant humans.&#8221; (<a href="https://dl.acm.org/doi/pdf/10.1145/3306618.3314250">Hadfield-Menell &amp; Hadfield</a>, 2019)</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Rose follows Carlsmith (2022) and defines power-seeking as &#8220;active efforts by an AI system to gain and maintain power in ways that designers didn&#8217;t intend, arising from problems with that
system&#8217;s objectives."</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>"Specification gaming is a behaviour that satisfies the literal specification of an objective without achieving the intended outcome." (<a href="https://www.deepmind.com/blog/specification-gaming-the-flip-side-of-ai-ingenuity">Krakovna et al.</a>, 2020).</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>"Goal misgeneralization is a specific form of robustness failure for learning algorithms in which the learned program competently pursues an undesired goal that leads to good performance in training situations but bad performance in novel test situations." (<a href="https://arxiv.org/abs/2210.01790">Shah et al.</a>, 2022a).</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>Joseph Carlsmith&#8217;s report <a href="https://arxiv.org/abs/2206.13353">Is Power-Seeking AI an Existential Risk?</a> reviews some evidence for most of the claims that are central to the argument that AI will pose an existential risk.</p><p></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>Last year, Katja wrote <a href="https://blog.aiimpacts.org/p/counterarguments-to-the-basic-ai-x-risk-case">Counterarguments to the basic AI x-risk case</a>, which outlines some arguments against existential risk from
AI.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Muddling Along Is More Likely Than Dystopia]]></title><description><![CDATA[Summary: There are historical precedents where bans or crushing regulations stop the progress of technology in one industry, while progress in the rest of society continues.]]></description><link>https://blog.aiimpacts.org/p/muddling-along-is-more-likely-than</link><guid isPermaLink="false">https://blog.aiimpacts.org/p/muddling-along-is-more-likely-than</guid><dc:creator><![CDATA[Jeffrey Heninger]]></dc:creator><pubDate>Fri, 20 Oct 2023 21:25:59 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/104eea37-53b6-4883-9d51-2691c44acc2e_1024x756.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>Summary:</strong> There are historical precedents where bans or crushing regulations stop the progress of technology in one industry, while progress in the rest of society continues. This is a plausible future for AI.</p><p><em>Epistemic Status: My intuition strongly disagrees with other people here. I hope to explain my intuition, and provide enough historical evidence to make this intuition at least plausible.</em></p><p></p><h3>Introduction: An Intuition Pump</h3><p>Suppose you told someone in 1978 that no new nuclear power plants would be built in the US until 2023.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> This would probably be very surprising. Nuclear power was supposed to be the power of the future.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> The Nuclear Regulatory Commission had only been created 3 years earlier.</p><p>Given this information, someone in 1978 might predict that something terrible was about to happen. 
Maybe a nuclear war between the USA and USSR that destroys America&#8217;s industrial capacity. Maybe economic collapse due to overpopulation or global warming. Maybe an Orwellian police state in time for <em>1984</em>, or a World Authority designed to regulate nuclear weapons that got out of hand.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a></p><p>None of this happened. Instead, the Nuclear Regulatory Commission increased the regulatory ratchet<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a> until building new nuclear power plants became uneconomical. These regulations only applied to the USA, but they seem to have significantly impacted nuclear power research globally. Countries that are building new nuclear power plants are still using designs that were developed before 1970.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a></p><p>Regulation on nuclear power probably did slow US economic growth over the next 45 years compared to the counterfactual.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a> But the past 45 years have hardly been catastrophic. 
Economic growth and innovation did continue, driven by other industries.</p><p></p><h3>Consequences of Stopping AGI?</h3><p>Some people involved in the debate about slowing or pausing AI seem to think that successfully stopping AI progress over the long term would likely lead to death or dystopia:</p><blockquote><p>Either we figure out how to make AGI go well or we wait for the asteroid to hit.</p><p> - Sam Altman<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a></p></blockquote><blockquote><p>If we don&#8217;t get AI, I think there&#8217;s a 50%+ chance in the next 100 years we end up dead or careening towards Venezuela. </p><p>- Scott Alexander<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a></p></blockquote><blockquote><p>I think we should be quite worried that the global government needed to enforce such a ban would greatly increase the risk of permanent tyranny, itself an existential catastrophe.</p><p> - Nora Belrose<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a></p></blockquote><blockquote><p>It seems likely that we would need to create a worldwide police state, as otherwise [an indefinite AI pause] would fail in the long run.</p><p> - Matthew Barnett<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a></p></blockquote><p>It feels to me like this is the same sort of mistake that our hypothetical person from 1978 made. It might seem like AI will be an extremely important thing in the future, and so something dramatic would have to happen in order to prevent it. 
I think that we should put more probability on the boring future where regulation stifles this one field, while the rest of society continues as it had before.</p><p>This seems like an important disagreement. If you think that our descendants&#8217; lives will be pretty good, and getting better if not unimaginably quickly, then stopping AI progress might be worth it for them. If you think that our descendants&#8217; future will be &#8220;short and grim,&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a> then they might be less of a consideration when deciding whether to take this risk now.</p><p></p><h3>Specific Concerns</h3><p>Scott Alexander mentions several specific concerns that cause him to be pessimistic about a future without AI progress. Each of them seems like a real problem that people now and in the future should be trying to solve. We should be improving biosecurity,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-12" href="#footnote-12" target="_self">12</a> promoting economic growth,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-13" href="#footnote-13" target="_self">13</a> spreading democracy &amp; freedom,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-14" href="#footnote-14" target="_self">14</a> and giving people hope in future generations. But none of these things seem even close to a 50% chance of causing death or dystopia in the next 100 years. It would not be worth accepting existential risk from AI, which Scott Alexander estimates as having a ~20% chance of causing human extinction, to avoid these.&nbsp;</p><p>Both Nora Belrose and Matthew Barnett are concerned that a global police state would be needed to enforce a long term ban on AI progress. This position does not seem uncommon in the AI safety community. 
The concerns are that research might shift to locations with fewer regulations, and that algorithmic progress will make AGI possible on a personal computer. The only way to avoid AGI then is a massive expansion of global government power.&nbsp;</p><p></p><h3>Other Historical Examples</h3><p>I do not think that these concerns have been realized with other technologies.&nbsp;</p><p>Regulations in one industry do not stop progress in all other industries. People in the Bay Area likely underestimate the importance of emerging technologies other than AI, or software more generally, because information technology is disproportionately important in the local economy.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-15" href="#footnote-15" target="_self">15</a> I would similarly expect that people living in Detroit in 1950 would underestimate the importance of emerging technologies other than cars. Lots of progress is still possible without AI. Two emerging technologies I am particularly excited about are fusion and space colonization.</p><p>Regulations in one country can stop progress in a single industry. Progress stopping in a particular industry is not that uncommon.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-16" href="#footnote-16" target="_self">16</a> Most innovation in a particular industry is done in one or a few cities. These clusters of innovation are difficult to build and maintain, so if one is crushed by regulation, it typically does not just move to another country. On a broader scale, some countries are much more innovative than others. In most industries, including heavily regulated ones, the USA is clearly more innovative than (most of) Europe or East Asia, which are much more innovative than the rest of the world. 
A lot has to go right: a high standard of living, an educated populace, the rule of law, the possibility of future profit, available capital, and a culture that encourages innovation. Countries which flout international regulations or norms typically do not attract innovation. Once a technology exists, it is much easier for other countries to copy it. The designs and skills needed already exist, and the benefits of the technology are clear. Regulation to prevent innovation is much easier than regulation to prevent proliferation.</p><p>I have previously investigated some Resisted Technological Temptations,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-17" href="#footnote-17" target="_self">17</a> or technologies where a long term pause has been achieved through our current institutions:</p><ul><li><p><strong>Nuclear power</strong>, discussed above.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-18" href="#footnote-18" target="_self">18</a></p></li></ul><ul><li><p><strong>Geoengineering</strong> is not explicitly illegal, but opposition from scientists and activists has prevented even research from being done.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-19" href="#footnote-19" target="_self">19</a></p></li></ul><ul><li><p><strong>Vaccine development</strong> is heavily regulated in Western countries. During the recent pandemic, both Russia and China relaxed some of these restrictions and approved vaccines before the West. The resulting vaccines were less effective, because the best medical research is still located in the West.
In particular, <strong>human challenge trials</strong> have been regulated into almost non-existence.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-20" href="#footnote-20" target="_self">20</a></p></li></ul><ul><li><p><strong>Nuclear weapons</strong> are sometimes mentioned as a technology where regulation has failed to prevent their spread. I think the evidence on this example is mixed. There are about 10 countries which have nuclear weapons (much fewer than the number which could), but only 2 developed them independently: the USA and France. Cutting edge research has not moved to countries which are not party to the Non-Proliferation Treaty. India, Pakistan, and North Korea seem to have similar capabilities as the USA or USSR in the 1940s and 50s.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-21" href="#footnote-21" target="_self">21</a> Testing bans have probably also helped keep nuclear weapons from becoming increasingly powerful: the most powerful bomb ever was detonated in 1961.</p></li></ul><ul><li><p><strong>Biological weapons</strong> have some of the strongest treaties and taboos against their development or use,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-22" href="#footnote-22" target="_self">22</a> and no country openly has a biological weapons program. 
A problem with this example is that the USSR signed treaties against developing biological weapons - and then continued developing them.</p></li></ul><ul><li><p>Various nuclear technologies, like <strong><s>atomic gardening</s></strong>, <strong>using nuclear explosions in construction</strong>, or <strong>Project Orion</strong>, have been proposed but not developed.</p></li></ul><ul><li><p><strong>Cloning the most effective soldiers </strong>has never been done.</p></li></ul><ul><li><p><strong>Genetic modification of humans </strong>has been done by one researcher in China, before he was arrested.</p></li></ul><ul><li><p>It is unclear whether <strong>colonialism</strong> counts as a technology. The Ming dynasty&#8217;s decision to stop their Treasure Fleets in the early 1400s delayed colonialism globally by about 50 years and may have impacted China&#8217;s developmental trajectory for centuries.</p></li></ul><ul><li><p><strong>Bell Labs </strong>invented or discovered the transistor, charged-coupled device, photovoltaic cell, information theory, Unix, C, and the cosmic microwave background radiation between 1945-1980.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-23" href="#footnote-23" target="_self">23</a> After the Bell System was broken up by antitrust laws in 1982, the research community there fragmented. It seems plausible that there are some technologies that were not invented because this unusually prolific center of innovation was destroyed.</p></li></ul><p>Most technologies are not banned, nor have had their progress stifled by regulation. Most technologies are also not as scary as AI: I have a hard time imagining how solar panels or ballpoint pens could constitute an x-risk. 
Scary-sounding technologies, like weapons of mass destruction or some kinds of medical research, often do face bans or regulations that make their development no longer worth it, and the bans sometimes work.</p><p>I don&#8217;t want to say that effective bans on scary-sounding technologies happen by default. When they do work, they are the result of a concerted effort. But making a ban on a new, potentially dangerous technology seems very doable without disrupting the rest of society.</p><p></p><h3>Maybe AI Will Be Different</h3><p>While this post is mostly about historical precedents of other technologies being stopped, it seems worth saying a few words on AI in particular. There are several reasons why AI might be different from other technologies:&nbsp;</p><ol><li><p>AI research is easier to do remotely than other emerging technologies.&nbsp;</p></li><li><p>Once an AI system is created, it can be transmitted easily, as software.</p></li><li><p>Simple economic models suggest that powerful AI would be extremely economically advantageous to whoever adopts it.</p></li></ol><p>All technologies are different. Some differences make regulations easier or harder, but none of these feel so different that they make regulation impossible:</p><ol><li><p>Law enforcement can enforce the law based on where the research is done or where the researchers live. The USA in particular has an expansive view of where its law can apply.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-24" href="#footnote-24" target="_self">24</a></p></li><li><p>Most proposed regulations focus on the hardware required to train powerful AI.</p></li><li><p>Policy makers do not know this. They know that someone is telling them this. They definitely do not know that they will get the economic promises of AGI on the timescales they care about, if they support this particular project. 
These promises are not that distinguishable from other technologies&#8217; hype.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-25" href="#footnote-25" target="_self">25</a></p></li></ol><p>There are also ways in which regulating AI is easier.</p><p>There are multiple stages of the supply chain where there are only one or a few companies in the world capable of cutting edge work. There are only a few actors who need to coordinate in order for regulation to be effective.</p><p>Current leading AI models require a lot of compute, which is capital intensive and easy to keep track of. This might change with enough improvements from algorithmic efficiency. But we should expect algorithmic progress to dramatically slow down in response to a long term pause on AI as capital and talent moves to other industries.</p><p>Lots of substances and items are regulated, and the details of this regulation vary widely based on what it is and what the government is trying to avoid.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-26" href="#footnote-26" target="_self">26</a> Regulating GPUs will have some unique challenges, but does not seem impossible under our current institutions.</p><p></p><h3>How Long of a Pause?</h3><p>Most of the historical evidence is for global pauses that have lasted about 50 years. This is useful evidence for discussing a 100 year pause. If &#8220;long term&#8221; means 1,000 years, then there is much less historical evidence. Matthew Barnett has argued that a regulatory ratchet within existing institutions might accomplish a 50 year pause in AI research, but something more dramatic would be needed for a 1,000 year pause.</p><p>I am skeptical that a global police state would be easier to maintain than more normal regulations for 1,000 years. 
My model for how to sustain institutions on this time scale is:</p><ol><li><p>Build an institution that lasts for a generation.</p></li><li><p>Convince the rising generation that this institution is a good thing to maintain.</p></li></ol><p>If you fail at (2), then it does not matter what institution was built. If not even the elite believe that the police state is a good thing, then it will not maintain itself.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-27" href="#footnote-27" target="_self">27</a> An institution which has less hard power, but is better at getting people to believe in it, is more likely to last 1,000 years.</p><p></p><h3>Conclusion</h3><p>Building AGI is an extremely uncertain endeavor. It might lead to Our Glorious Future. It might lead to human extinction. It might not even be possible. If we decide to not try to build AGI, the future seems much less uncertain. Society will continue to be clearly not optimal, but also far from dystopian. Making scientific, technological, economic, social, and political progress will continue to be hard, but people will continue to do it. We can continue to hope for at least marginal improvements for our children, and they for their children, long into the future.</p><p>It should not be surprising if a scary-sounding technology faces a regulatory ratchet that slows and then stops all progress in that field. This is not death or dystopia - it&#8217;s normal.</p><p></p><p><em>Thanks to Aaron Scher, Matthew Barnett, Rose Hadshar, Harlan Stewart, and Rick Korzekwa for useful discussion on this topic.</em></p><p><em>Preview image by Theen Moy: <a href="https://www.flickr.com/photos/theenmoy/8003177753">https://www.flickr.com/photos/theenmoy/8003177753</a>.</em></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>This is not quite fair because the date range extends from the start of construction of one plant (Shearon Harris) to the end of construction of a different plant (Vogtle Unit 3). Vogtle Unit 3 started construction in 2013. There is also a nuclear power plant (Watts Bar Unit 2) that started construction in 1973 and was completed in 2016.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>In 1973, the Atomic Energy Commission projected that 55.8% of the USA&#8217;s electricity would come from nuclear power by 2000, which was lower than it had previously projected. This did not happen: nuclear power has accounted for about 20% of the USA&#8217;s electricity since the late 1980s.</p><p>Anthony Ripley. <em>A.E.C. Lowers Estimate Of Atom Power Growth. </em>New York Times.
(1973) <a href="https://www.nytimes.com/1973/03/08/archives/aec-lowers-estimate-of-atom-power-growth.html">https://www.nytimes.com/1973/03/08/archives/aec-lowers-estimate-of-atom-power-growth.html</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Some prominent people, including Bertrand Russell, were advocating the creation of a World Authority to prevent the existential risk from nuclear weapons:</p><blockquote><p>A much more desirable way of securing world peace would be by a voluntary agreement among nations to pool their armed forces and submit to an agreed International Authority. This may seem, at present, a distant and Utopian prospect, but there are practical politicians who think otherwise. A World Authority, if it is to fulfill its function, must have a legislature and an executive and irresistible military power. All nations would have to agree to reduce national armed forces to the level necessary for internal police action. No nation should be allowed to retain nuclear weapons or any other means of wholesale destruction. &#8230; In a world where separate nations were disarmed, the military forces of the World Authority would not need to be very large and would not constitute an onerous burden upon the various constituent nations.</p></blockquote><p>Bertrand Russell. <em>Has Man A Future? </em>(1961) Quoted from Global Governance Forum. (Accessed October 17, 2023) <a href="https://globalgovernanceforum.org/visionary/bertrand-russell/">https://globalgovernanceforum.org/visionary/bertrand-russell/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>Mark R. Lee. 
<em>The Regulatory Ratchet: Why Regulation Begets Regulation. </em>University of Cincinnati Law Review <strong>87.3</strong>. (2019) <a href="https://scholarship.law.uc.edu/cgi/viewcontent.cgi?article=1286&amp;context=uclr">https://scholarship.law.uc.edu/cgi/viewcontent.cgi?article=1286&amp;context=uclr</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>For example, one &#8220;new&#8221; design for a nuclear power plant is a molten salt reactor. One currently exists: TMSR-LF1, an experimental reactor producing 2 MW of thermal power in northwestern China. The design is based on the molten salt reactor experiment (MSRE) which produced 7 MW of thermal power at Oak Ridge National Lab in the USA from 1965-1969. </p><p>Similarly, China has a small modular reactor which began power production in 2021, HTR-PM. It is a pebble-bed reactor, based on a demonstration reactor in Germany (AVR), which ran from 1967-1988. </p><p>All other nuclear power plants use reactor types that are even older.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>I have previously estimated the direct value foregone by the prohibitively high costs of nuclear power in the USA. I also expect there to have been additional indirect value as a result of having less expensive electricity.</p><p><em>Resisted Technological Temptation: Nuclear Power. </em>AI Impacts Wiki. 
(Accessed October 18, 2023) <a href="https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/nuclear_power">https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/nuclear_power</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>Sam Altman. Twitter. (2022) <a href="https://twitter.com/sama/status/1540781762241974274?lang=en">https://twitter.com/sama/status/1540781762241974274?lang=en</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>The entire quote is:</p><blockquote><p>Second, if we never get AI, I expect the future to be short and grim. Most likely we kill ourselves with synthetic biology. If not, some combination of technological and economic stagnation, rising totalitarianism + illiberalism + mobocracy, fertility collapse and dysgenics will impoverish the world and accelerate its decaying institutional quality. I don&#8217;t spend much time worrying about any of these, because I think they&#8217;ll take a few generations to reach crisis level, and I expect technology to flip the gameboard well before then. But if we ban all gameboard-flipping technologies (the only other one I know is genetic enhancement, which is even more bannable), then we do end up with bioweapon catastrophe or social collapse. I&#8217;ve said before I think there&#8217;s a ~20% chance of AI destroying the world. But if we don&#8217;t get AI, I think there&#8217;s a 50%+ chance in the next 100 years we end up dead or careening towards Venezuela. That doesn&#8217;t mean I have to support AI accelerationism because 20% is smaller than 50%. 
Short, carefully-tailored pauses could improve the chance of AI going well by a lot, without increasing the risk of social collapse too much. But it&#8217;s something on my mind.</p></blockquote><p>Scott Alexander. <em>Pause for Thought: The AI Pause Debate.</em> Astral Codex Ten. (2023) <a href="https://www.astralcodexten.com/p/pause-for-thought-the-ai-pause-debate">https://www.astralcodexten.com/p/pause-for-thought-the-ai-pause-debate</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>Nora Belrose. <em>AI Pause Will Likely Backfire. </em>EA Forum. (2023) <a href="https://forum.effectivealtruism.org/s/vw6tX5SyvTwMeSxJk/p/JYEAL8g7ArqGoTaX6">https://forum.effectivealtruism.org/s/vw6tX5SyvTwMeSxJk/p/JYEAL8g7ArqGoTaX6</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>The entire quote is:</p><blockquote><p>Note that I am not saying AI pause advocates necessarily directly advocate for a global police state. Instead, I am arguing that in order to sustain an indefinite pause for sufficiently long, it seems likely that we would need to create a worldwide police state, as otherwise the pause would fail in the long run. One can choose to &#8220;bite the bullet&#8221; and advocate a global police state in response to these arguments, but I&#8217;m not implying that&#8217;s the only option for AI pause advocates.</p><p>One reason to bite the bullet and advocate a global police state to pause AI indefinitely is that even if you think a global police state is bad, you could think that a global AI catastrophe is worse. 
I actually agree with this assessment in the case where an AI catastrophe is clearly imminent.</p><p>However, while I am not dogmatically opposed to the creation of a global police state, I still have a heuristic against pushing for one, and think that strong evidence is generally required to override this heuristic. I do not think the arguments for an AI catastrophe have so far met this threshold. The primary existing arguments for the catastrophe thesis appear abstract and divorced from any firm empirical evidence about the behavior of real AI systems.</p></blockquote><p>Matthew Barnett. <em>The possibility of an indefinite AI pause. </em>EA Forum. (2023) <a href="https://forum.effectivealtruism.org/s/vw6tX5SyvTwMeSxJk/p/k6K3iktCLCTHRMJsY">https://forum.effectivealtruism.org/s/vw6tX5SyvTwMeSxJk/p/k6K3iktCLCTHRMJsY</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p>Scott Alexander. <em>Pause for Thought: The AI Pause Debate.</em> Astral Codex Ten. (2023) <a href="https://www.astralcodexten.com/p/pause-for-thought-the-ai-pause-debate">https://www.astralcodexten.com/p/pause-for-thought-the-ai-pause-debate</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-12" href="#footnote-anchor-12" class="footnote-number" contenteditable="false" target="_self">12</a><div class="footnote-content"><p>Toby Ord estimates the biosecurity x-risk over the next century to be about 1/30 in <em>The Precipice</em>. The biosecurity community seems to have been more successful at fighting x-risk than the AI safety community. There are already extensive regulations in the countries where most research is done, as well as major international treaties against developing biological weapons. 
If you think that AI is more dangerous than synthetic biology, then it does not make sense to advance AI in order to improve biosecurity. It is not even clear whether increasingly powerful AI would make biosecurity better or worse.</p><p>For comparison, Toby Ord estimates the x-risk from asteroid impacts over the next century to be about 1/1,000,000. I interpret Sam Altman&#8217;s stated concern about asteroids as a proxy for all other existential risk. Otherwise, his risk estimates seem off by many orders of magnitude.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-13" href="#footnote-anchor-13" class="footnote-number" contenteditable="false" target="_self">13</a><div class="footnote-content"><p>I do not think that we have run out of human-achievable economic, technological, or scientific progress. The median person will likely be much wealthier in 100 years than today, even without AGI.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-14" href="#footnote-anchor-14" class="footnote-number" contenteditable="false" target="_self">14</a><div class="footnote-content"><p>Political and social trends in most countries over the last decade don&#8217;t seem good. Political and social trends in most countries over the last century seem wonderful. 
We should look at both when predicting the next century.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-15" href="#footnote-anchor-15" class="footnote-number" contenteditable="false" target="_self">15</a><div class="footnote-content"><p>The information sector, which includes both information technology and traditional media, accounts for 5.5% of the US GDP.</p><p><a href="https://www.bls.gov/emp/tables/output-by-major-industry-sector.htm">https://www.bls.gov/emp/tables/output-by-major-industry-sector.htm</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-16" href="#footnote-anchor-16" class="footnote-number" contenteditable="false" target="_self">16</a><div class="footnote-content"><p><em>Examples of Progress for a Particular Technology Stopping. </em>AI Impacts Wiki. (Accessed October 19, 2023) <a href="https://wiki.aiimpacts.org/ai_timelines/examples_of_progress_for_a_particular_technology_stopping">https://wiki.aiimpacts.org/ai_timelines/examples_of_progress_for_a_particular_technology_stopping</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-17" href="#footnote-anchor-17" class="footnote-number" contenteditable="false" target="_self">17</a><div class="footnote-content"><p><em>Resisted Technological Temptations Project. </em>AI Impacts Wiki. 
(Accessed October 18, 2023) <a href="https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/resisted_technological_temptations_project">https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/resisted_technological_temptations_project</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-18" href="#footnote-anchor-18" class="footnote-number" contenteditable="false" target="_self">18</a><div class="footnote-content"><p><em>Resisted Technological Temptation: Nuclear Power. </em>AI Impacts Wiki. (Accessed October 18, 2023) <a href="https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/nuclear_power">https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/nuclear_power</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-19" href="#footnote-anchor-19" class="footnote-number" contenteditable="false" target="_self">19</a><div class="footnote-content"><p><em>Resisted Technological Temptation: Geoengineering. </em>AI Impacts Wiki. (Accessed October 18, 2023) <a href="https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/geoengineering">https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/geoengineering</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-20" href="#footnote-anchor-20" class="footnote-number" contenteditable="false" target="_self">20</a><div class="footnote-content"><p><em>Resisted Technological Temptation: Vaccine Challenge Trials. </em>AI Impacts Wiki. 
(Accessed October 18, 2023) <a href="https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/vaccine_challenge_trials">https://wiki.aiimpacts.org/responses_to_ai/technological_inevitability/incentivized_technologies_not_pursued/vaccine_challenge_trials</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-21" href="#footnote-anchor-21" class="footnote-number" contenteditable="false" target="_self">21</a><div class="footnote-content"><p>I do not know what Israel&#8217;s nuclear program is like, or how much of it is the result of technology transfer from the US as opposed to indigenous innovation.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-22" href="#footnote-anchor-22" class="footnote-number" contenteditable="false" target="_self">22</a><div class="footnote-content"><p>Offensive biological weapons use is banned by the Geneva Protocol (1925) and development, production, acquisition, transfer, stockpiling &amp; use of biological weapons is banned by the Biological Weapons Convention (1972). In addition to the treaties, biological weapons seem to have a significant taboo against their use.</p><p>Michelle Bentley. <em>The Biological Weapons Taboo. </em>War on the Rocks. (2023) <a href="https://warontherocks.com/2023/10/the-biological-weapons-taboo/">https://warontherocks.com/2023/10/the-biological-weapons-taboo/</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-23" href="#footnote-anchor-23" class="footnote-number" contenteditable="false" target="_self">23</a><div class="footnote-content"><p>Iulia Georgescu. <em>Bringing back the golden days of Bell Labs. </em>Nature Reviews Physics <strong>4</strong>. (2022) p. 76-78. 
<a href="https://www.nature.com/articles/s42254-022-00426-6">https://www.nature.com/articles/s42254-022-00426-6</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-24" href="#footnote-anchor-24" class="footnote-number" contenteditable="false" target="_self">24</a><div class="footnote-content"><p>For example, Sam Bankman-Fried is being tried in a US federal court, despite having moved himself and his business to The Bahamas.</p><p>Another example involves the US Justice Department having FIFA officials from various countries arrested in Switzerland for corruption. &#8220;United States law allows for extradition and prosecution of foreign nationals under a number of statutes &#8230; In this case, she said, FIFA officials used the American banking system as part of their scheme.&#8221;<br><br>Stephanie Clifford and Matt Apuzzo. <em>After Indicting 14 Soccer Officials, U.S. Vows to End Graft in FIFA. </em>New York Times. (2015) <a href="https://www.nytimes.com/2015/05/28/sports/soccer/fifa-officials-arrested-on-corruption-charges-blatter-isnt-among-them.html">https://www.nytimes.com/2015/05/28/sports/soccer/fifa-officials-arrested-on-corruption-charges-blatter-isnt-among-them.html</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-25" href="#footnote-anchor-25" class="footnote-number" contenteditable="false" target="_self">25</a><div class="footnote-content"><p>For example, Project Excalibur promised to neutralize the threat of Soviet nuclear weapons by destroying dozens of ICBMs (with hundreds of warheads) as they launched. It ended up being infeasible.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-26" href="#footnote-anchor-26" class="footnote-number" contenteditable="false" target="_self">26</a><div class="footnote-content"><p><em>Examples of Regulated Things. </em>AI Impacts Wiki. 
(Accessed October 19, 2023) <a href="https://wiki.aiimpacts.org/responses_to_ai/examples_of_regulated_things">https://wiki.aiimpacts.org/responses_to_ai/examples_of_regulated_things</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-27" href="#footnote-anchor-27" class="footnote-number" contenteditable="false" target="_self">27</a><div class="footnote-content"><p>This is my oversimplified model of what happened to the USSR.</p><p></p></div></div>]]></content:encoded></item></channel></rss>