<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[SynthCog: Glossary]]></title><description><![CDATA[Helpful terms, concepts, and definitions]]></description><link>https://www.synthcog.blog/s/glossary</link><image><url>https://substackcdn.com/image/fetch/$s_!mnE8!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F67459931-ee8c-4325-a212-c22e3092d4a7_256x256.png</url><title>SynthCog: Glossary</title><link>https://www.synthcog.blog/s/glossary</link></image><generator>Substack</generator><lastBuildDate>Thu, 14 May 2026 00:38:40 GMT</lastBuildDate><atom:link href="https://www.synthcog.blog/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Lucid Beast Inc.]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[synthcog@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[synthcog@substack.com]]></itunes:email><itunes:name><![CDATA[DK]]></itunes:name></itunes:owner><itunes:author><![CDATA[DK]]></itunes:author><googleplay:owner><![CDATA[synthcog@substack.com]]></googleplay:owner><googleplay:email><![CDATA[synthcog@substack.com]]></googleplay:email><googleplay:author><![CDATA[DK]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Terms and Concepts]]></title><description><![CDATA[Terms and concepts commonly used in discussions of AI and AGI as well as some specific to the SynthCog blog.]]></description><link>https://www.synthcog.blog/p/terms-and-concepts</link><guid isPermaLink="false">https://www.synthcog.blog/p/terms-and-concepts</guid><pubDate>Mon, 17 Apr 2023 20:57:58 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!T1-S!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!T1-S!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png 424w, https://substackcdn.com/image/fetch/$s_!T1-S!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png 848w, https://substackcdn.com/image/fetch/$s_!T1-S!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png 1272w, https://substackcdn.com/image/fetch/$s_!T1-S!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!T1-S!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png" width="1312" height="928" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:928,&quot;width&quot;:1312,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1764426,&quot;alt&quot;:&quot;ancient book carved from wood with engraved writing&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="ancient book carved from wood with engraved writing" title="ancient book carved from wood with engraved writing" srcset="https://substackcdn.com/image/fetch/$s_!T1-S!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png 424w, https://substackcdn.com/image/fetch/$s_!T1-S!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png 848w, https://substackcdn.com/image/fetch/$s_!T1-S!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png 1272w, https://substackcdn.com/image/fetch/$s_!T1-S!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F303066c2-d7a7-4df9-af37-ccbb1dd8bd7d_1312x928.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><h4>AGI System</h4><p>A software/hardware platform that employs artificial general intelligence technology.</p><h4>AI Doubter</h4><p>One who doubts that machine intelligence at human-level or beyond will be achieved until sometime in the distant future, if ever.</p><h4>AI Dystopian</h4><p>One who is alarmed by the possibility of human-level or beyond machine intelligence and believes that it will pose a potentially existential threat to humanity.</p><h4>AI Pragmatist</h4><p>One who is cautiously optimistic about the possibility of human-level or beyond machine intelligence and the potential benefits it may provide at some point in the future, while keeping in mind the potential pitfalls and dangers that may come with it.</p><h4>AI System</h4><p>A software/hardware platform that employs AI technology.</p><h4>AI Utopian</h4><p>One
who is excited about the possibility of human-level or beyond machine intelligence and believes it will provide an extraordinary bounty for humanity.</p><h4>Artificial General Intelligence (AGI)</h4><p>The field concerned with the creation of machines that approach or surpass the level and quality of human intelligence.</p><h4>Artificial Intelligence (AI)</h4><p>The contemporary field that includes machine learning and symbolic logic and typically involves techniques such as the statistical analysis of massive data sets, weighted evaluation networks, self-modifying feedback loops, and hierarchical relationship networks.</p><h4>Bounded Rationality</h4><p>A model of rational decision making that recognizes the constraints inherent in any real-world situation: limits on cognitive ability, knowledge, and time.</p><h4>Cognition</h4><p>The mental action or process of acquiring knowledge and understanding through thought, experience, and the senses (as opposed to intelligence, which is the result of those processes).</p><h4>Cognition Engine</h4><p>A software or hardware module that is capable of synthetic cognition.
An AGI system would employ a cognition engine to exhibit intelligence.</p><h4>Cognitive Entity</h4><p>A being, biological or synthetic, that is capable of cognition and exhibits intelligence.</p><h4>Conclusion of Convenience</h4><p>The conclusion that the outcome of an advantageous conjecture will occur within the lifetime of the person making the conjecture.</p><h4>Control Problem</h4><p>The concern that an AGI system, particularly a superintelligent system, would be difficult or impossible to control, contain, or terminate and would instead attempt to control, contain, or terminate others to preserve itself.</p><h4>Cooperative Inverse Reinforcement Learning (CIRL)</h4><p>An AI learning model involving a human and an artificial agent in which the agent is not given the reward parameters of its utility function but instead must infer them from observing the human. The intent is to maximize the realization of human values.</p><h4>Genetic Engineering</h4><p>The direct manipulation of an organism's genetic code using technology.</p><h4>Goal-Content Integrity</h4><p>The drive that motivates an intelligent agent to prevent alterations of its present ultimate goals so as to ensure that those goals are more likely to be achieved by its future self, whatever form that future self takes.</p><h4>GOUFI</h4><p>A model of intelligence as a phenomenon based on attaining goals and governed by an algorithm designed to maximize the attainment of those goals. 
GOUFI is an acronym for <strong>G</strong>oal-attainment <strong>O</strong>ptimization driven by a <a href="https://www.synthcog.blog/i/115475426/utility-function"><strong>U</strong>tility <strong>F</strong>unction</a> as <strong>I</strong>ntelligence.</p><h4>Instrumental Convergence Thesis</h4><p>The conjecture that for a wide range of potential ultimate goals pursued by an intelligent entity, we can identify a number of subgoals that are likely to be pursued. These subgoals have the potential to be detrimental to humanity even if the ultimate goals themselves are harmless.</p><h4>Instrumental Goal</h4><p>An intermediary subgoal formed to facilitate achieving an ultimate goal. There is not necessarily an obvious correlation between an instrumental goal and the ultimate goals it facilitates.</p><h4>Instrumental Rationality</h4><p>Rationality that is confined to some subdomain of endeavor or circumstances.</p><h4>Intelligence</h4><p>Intelligence is that quality which allows an entity to solve a wide range of deductive and inductive problems, extract and prioritize information from the environment, infer causal as well as correlative relationships from both small and large data sets over many known and novel domains, generalize knowledge from a known domain to another known or novel domain, and extrapolate probable outcomes from both factual and counterfactual circumstances. It also allows an entity to recognize in its own cognition both the potential for fallacies and the fallacies themselves, to synthesize existing knowledge to form original concepts, and to acquire awareness of its own cognition and of itself as an independent and unique entity distinct from other entities and from its environment.</p><h4>Intelligence Explosion</h4><p>The supposition that if humans were able to develop an AGI
system even slightly more intelligent than themselves, then that system would be able to do the same, and each subsequent system would be able to do the same in turn. This would result in a superintelligent system far beyond the level of humans.</p><h4>Laplace's Demon</h4><p>A postulated infinitely vast intelligence able to know the exact movements of every atom in the universe. This intelligence is at the heart of a thought experiment in which it would be possible to determine all future events in a deterministic universe.</p><h4>Large Language Model (LLM)</h4><p>A Large Language Model is a neural network built using a predictive deep learning model with billions of parameters. It is typically trained on extremely large data sets with self-supervised or semi-supervised learning, and it excels at a wide range of tasks.</p><h4>Law of Accelerating Returns</h4><p>The premise that the history of technology demonstrates that technological change is exponential, and that when any technological barrier is reached, a new paradigm will be found to surmount it.</p><h4>Machine Intelligence</h4><p>Intelligence anywhere on the spectrum from AI to superintelligence that is exhibited by a machine rather than a biological organism.</p><h4>Machine Learning</h4><p>A field of computer science that involves algorithms that automatically improve through experience. These algorithms frequently involve the use of artificial neural networks, although other techniques are also used.</p><h4>Molecular Assembler</h4><p>A hypothetical machine that operates at the nanoscale and can manipulate individual atoms and molecules with atomic precision.</p><h4>Nanotechnology</h4><p>Originally coined to refer to technology that was created at the nanoscale (1 billionth of a meter) and operated on individual atoms and molecules.
Later co-opted to refer to science that involves materials with at least one dimension at the nanoscale.</p><h4>Orthogonality Thesis</h4><p>The conjecture that intelligence and goals are not directly correlated, and consequently one can't assume that some level of intelligence would guarantee a particular subset of goals and exclude some other subset of goals. Conversely, one can't assume that a subset of goals would be guaranteed to be pursued or not pursued by any particular level of intelligence.</p><h4>Paperclip Maximizer</h4><p>A thought experiment involving an AGI system that&#8217;s designed with the benign goal of maximizing the number of paperclips it creates. Inevitably, it tries to turn the entire universe into paperclips. The thought experiment is meant to demonstrate that the behavior of such a system is not predictable, and that even benign goals can have dire effects on humanity.</p><h4>Rational Agent</h4><p>A model of human behavior originally used in economic theory to represent a consumer and the choices that consumer is most likely to make. Later used in various theories and speculation involving artificial general intelligence.</p><h4>Scientific Method</h4><p>The method of discovery that involves careful observation and applying rigorous skepticism to what is observed, given that cognitive assumptions can distort how one interprets the observation. It involves formulating hypotheses via induction from such observations; testing deductions drawn from those hypotheses through experiment and measurement; and refining (or eliminating) the hypotheses based on the experimental findings.</p><h4>Superintelligence</h4><p>A level of intelligence radically beyond that of any human.</p><h4>Synthetic Cognition</h4><p>Cognition that is synthesized rather than occurring naturally, i.e.
not biological in nature.</p><h4>Synthetic Cognitive Entity</h4><p>A strictly non-biological being that is capable of cognition and exhibits intelligence.</p><h4>Technological Singularity</h4><p>A moment in time when the rate of technological change advances so rapidly that we're unable to conceive of what might lie past it. At this point all our conceptions of humanity and society quickly fall away as the nature of technology and our relationship with it engender a new reality, a future discontinuous with our past. Many believe that the driving force behind this event is the accelerating evolution of intelligence itself, particularly the development of artificial general intelligence.</p><h4>Unintended Obfuscation</h4><p>A situation that arises when a group of individuals talk past each other but are unaware that they&#8217;re doing so. It frequently occurs when discussing complex subjects. The individuals who don&#8217;t understand the subject offer essentially incoherent questions or statements, while the individuals who do understand the subject try to answer or respond without acknowledging the incoherence.</p><h4>Utility Function</h4><p>The algorithm guiding an intelligent agent's actions and designed to maximize the agent's ability to achieve its goals.</p><h4>Value Alignment Problem</h4><p>The issue of negative outcomes arising due to an AGI system's goals not aligning with human values.</p>]]></content:encoded></item><item><title><![CDATA[Logical Fallacies]]></title><description><![CDATA[Compendium of logical fallacies that confuse and misdirect and that frequently arise in discussions of AI and AGI.]]></description><link>https://www.synthcog.blog/p/logical-fallacies</link><guid isPermaLink="false">https://www.synthcog.blog/p/logical-fallacies</guid><pubDate>Mon, 17 Apr 2023 20:57:28 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0L5d!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!0L5d!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png 424w, https://substackcdn.com/image/fetch/$s_!0L5d!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png 848w, https://substackcdn.com/image/fetch/$s_!0L5d!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png 1272w, https://substackcdn.com/image/fetch/$s_!0L5d!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!0L5d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png" width="1312" height="928" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:928,&quot;width&quot;:1312,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1828592,&quot;alt&quot;:&quot;Post with signs pointing in many different directions on a path in the desert with another sign nearby pointing down the path&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Post with signs pointing in many different directions on a path in the desert with another sign nearby pointing down the path" title="Post with signs pointing in many different directions on a path in the desert with another sign nearby pointing down the path" srcset="https://substackcdn.com/image/fetch/$s_!0L5d!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png 424w, https://substackcdn.com/image/fetch/$s_!0L5d!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png 848w, https://substackcdn.com/image/fetch/$s_!0L5d!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png 1272w, https://substackcdn.com/image/fetch/$s_!0L5d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3366b3c-f26e-406a-9361-3ecf6094e689_1312x928.png 1456w" sizes="100vw" 
fetchpriority="high"></picture></div></a></figure></div><h3>Common Logical Fallacy Definitions</h3><h4>Ad Hominem / Circumstantial Ad Hominem</h4><p>An attack on an opponent personally rather than on the basis of their argument.
A <strong>Circumstantial Ad Hominem</strong> fallacy is an argument that an opponent's conclusion is wrong due to the opponent's personal situation or perceived benefit from their conclusion rather than the merits of the conclusion itself.</p><h4>Appeal to Accomplishment</h4><p>Accepting an assertion based on the accomplishments of the proposer.</p><h4>Appeal to Authority</h4><p>Accepting an assertion as true because of the position or authority of the person asserting it.</p><h4>Appeal to Consequences</h4><p>A conclusion based on the positive or negative outcome of a premise rather than on the merit of the premise itself.</p><h4>Appeal to Emotion</h4><p>An argument employing the manipulation of emotions rather than the merits of its premise.</p><h4>Appeal to Ignorance</h4><p>Assuming that a claim is true because it has not been proven false, or vice versa.</p><h4>Appeal to the Stone</h4><p>Dismissing an argument as absurd without demonstrating proof of its absurdity.</p><h4>Argument From Incredulity</h4><p>A conclusion that because something seems beyond the capacity of its proposer to explain or quantify, it must be supernatural in nature.</p><h4>Broken Window</h4><p>An argument that disregards the opportunity cost of its premise. 
For example, stating that breaking a window is good because it generates income for a window repairperson, while disregarding the fact that the money and other resources spent on the new window are no longer available to be spent on other things.</p><h4>Cherry Picking</h4><p>Basing an argument on a chosen subset of data that boosts the argument while ignoring the equally valid data that counter the argument.</p><h4>Circular Argument</h4><p>Stating what is essentially the conclusion of an argument as a basis for that argument.</p><h4>Equivocation</h4><p>The misleading use of a term with multiple meanings to advantageously obfuscate its use in an argument.</p><h4>False Analogy</h4><p>Employing an analogy in an argument that is poorly suited to the parameters of that argument.</p><h4>False Dichotomy</h4><p>Oversimplifying the range of options in a premise or conclusion, typically by assuming only one of two options is possible.</p><h4>Hasty Generalization</h4><p>Basing a broad conclusion on an insufficient sample or stating a conclusion without all of the information required to do so.</p><h4>Ipse Dixit</h4><p>An assertion without proof or evidence, literally &#8220;He himself said it.&#8221; It&#8217;s similar to an <strong>Appeal to Authority</strong>, with the authority being the person promoting the assertion.</p><h4>Ludic Fallacy</h4><p>The assumption that simplified models can accurately predict real world processes. This assumption involves a failure to take into account the unknown unknowns when estimating the probability of events or outcomes.</p><h4>Mind Projection Fallacy</h4><p>Arguments in which the premise is based on subjective judgments of an object or situation rather than on objective observation. The assumption is made that these subjective judgments are inherent properties rather than personal perceptions, that other people share these perceptions, and that any who don't are irrational or misinformed.</p><h4>Moving the Goalposts</h4><p>Dismissing a response to an argument by changing the parameters of that argument or the requirements of a successful counterargument.</p><h4>Onus Probandi</h4><p>Shifting the burden of proof from the one who makes a claim to the one who refutes or questions it.</p><h4>Reification</h4><p>Treating a hypothetical construct as if it were a real-world object, event, or situation.</p><h4>Straw Man</h4><p>An argument based on misrepresentation of an opponent's position.</p><h3>Some New Logical Fallacy Definitions</h3><h4>Bad Engineer</h4><p>Basing a premise on a hypothetical technical situation that could only arise through completely inept engineering. Typically used by non-engineers to make a point, without recognizing that the technology in question would never be designed as hypothesized by any engineer other than a fictional one.</p><h4>Subjective Inflation</h4><p>Inflating the importance of one object, event, or situation over another without providing evidence or objective justification for such a hierarchy.</p><h4>Unproven Basis</h4><p>Stating an initial, unproven proposition, then basing subsequent conclusions on that proposition as if it had been proven. This typically involves a preliminary conjecture supported only by hand-waving arguments followed by conclusions that only follow if that conjecture had in fact been proven. It's related to the <strong>Ipse Dixit</strong> fallacy, which is an assertion without proof or evidence.
The difference is that the fallacy in <strong>Ipse Dixit</strong> is the initial assertion and the fallacy in <strong>Unproven Basis</strong> is making the subsequent assertions whose only evidence is that initial statement.</p>]]></content:encoded></item><item><title><![CDATA[Cognitive Biases]]></title><description><![CDATA[A compendium of cognitive biases we've inherited from our ancestors that manipulate our thinking and that frequently crop up in discussions of AI/AGI.]]></description><link>https://www.synthcog.blog/p/cognitive-biases</link><guid isPermaLink="false">https://www.synthcog.blog/p/cognitive-biases</guid><dc:creator><![CDATA[DK]]></dc:creator><pubDate>Mon, 17 Apr 2023 20:55:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!v0uL!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!v0uL!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!v0uL!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg 424w, https://substackcdn.com/image/fetch/$s_!v0uL!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg 848w,
https://substackcdn.com/image/fetch/$s_!v0uL!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!v0uL!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!v0uL!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg" width="1312" height="928" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:928,&quot;width&quot;:1312,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:732626,&quot;alt&quot;:&quot;ancient human ancestor holding hand puppet of a modern human&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="ancient human ancestor holding hand puppet of a modern human" title="ancient human ancestor holding hand puppet of a modern human" srcset="https://substackcdn.com/image/fetch/$s_!v0uL!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!v0uL!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg 848w, https://substackcdn.com/image/fetch/$s_!v0uL!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!v0uL!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe11ba6fd-4cdc-4697-93fa-2c1894925dba_1312x928.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><h4>Anthropomorphism</h4><p>Projecting human characteristics onto non-human entities or objects.</p><h4>Availability Cascade</h4><p>The self-reinforcing dynamic in which a belief gains increasing plausibility through its increasing repetition in the public sphere.</p><h4>Bias Blind Spot</h4><p>Assuming that one is less biased than others, or identifying cognitive biases in others more readily than in oneself.</p><h4>Confirmation Bias</h4><p>Noticing and remembering information that supports one's opinions and filtering out information that contradicts them.</p><h4>Congruence Bias</h4><p>Considering only evidence directly supporting one's hypothesis rather than evidence that indirectly supports it or disproves it while disregarding any alternative hypotheses.</p><h4>Expectation Bias</h4><p>Believing and promoting data that agree with one's expectations and disbelieving or disregarding data that conflict with one's expectations.</p><h4>Framing Effect</h4><p>Drawing different conclusions from the same data depending on the manner in which the data is presented.</p><h4>Gambler's Fallacy</h4><p>Assuming the probability of future events is altered by past events that actually have no probabilistic connection.</p><h4>Groupthink</h4><p>Dysfunctional reasoning that results when members of a group suppress or ignore alternate viewpoints that conflict with the group consensus.</p><h4>Halo Effect</h4><p>Perceiving a person's positive or negative traits in one area as applicable to an unrelated area.</p><h4>Hostile Attribution Bias</h4><p>Interpreting the behaviors or opinions of others to be hostile in nature when no hostility is objectively present.</p><h4>Illusion of Control</h4><p>Overestimating one's ability to influence external events.</p><h4>Illusory Correlation</h4><p>Perceiving an illusory relationship between unrelated events.</p><h4>Illusory Truth
Effect</h4><p>Belief in statements based on how easy they are to understand or how often they have been repeated, rather than on the underlying evidence supporting them.</p><h4>Just-World Hypothesis</h4><p>Rationalizing otherwise inexplicable injustice or negative outcomes based on a general belief that society or the universe is fundamentally just.</p><h4>Law of the Instrument</h4><p>Over-reliance on familiar tools and methods rather than potentially more appropriate alternatives. Related to the maxim, "If you only have a hammer, then everything looks like a nail."</p><h4>Na&#239;ve Realism</h4><p>Believing that one perceives reality objectively and without bias, that other rational people will perceive it in the same way, and that anyone who perceives it differently is biased, uninformed, or otherwise irrational.</p><h4>Neglect of Probability</h4><p>Disregarding the actual probabilities of a premise and its alternatives when forming statements or conclusions.</p><h4>Normalcy Bias</h4><p>The tendency to disregard the possibility or severity of a disaster because it has never happened before.</p><h4>Observer-Expectancy Effect</h4><p>Unconsciously manipulating or misinterpreting experiments or data to better fit an expected result.</p><h4>Omission Bias</h4><p>Judging harmful actions as worse than equally harmful inaction.</p><h4>Optimism Bias</h4><p>Overestimating the likelihood of favorable or desired outcomes.</p><h4>Overconfidence Effect</h4><p>The tendency to have more confidence in one's opinions or abilities than is objectively warranted.</p><h4>Parkinson's Law of Triviality</h4><p>Giving disproportionate weight to trivial but tractable issues rather than important but complex ones.</p><h4>Pessimism Bias</h4><p>Overestimating the likelihood of unfavorable or undesired outcomes.</p><h4>Planning Fallacy</h4><p>Underestimating the time it will take to reach a goal.</p><h4>Pro-Innovation Bias</h4><p>Excessive optimism towards an innovation's benefit to society without consideration of any 
potential negative impact.</p><h4>Projection Bias</h4><p>Overestimating how much other people, now or in the future, share one's current beliefs and behaviors, or how much one's future self will share one's current beliefs and behaviors.</p><h4>Pseudocertainty Effect</h4><p>Certainty in a conclusion despite insufficient evidence to support that certainty. Frequently builds upon itself with each successive stage of a multi-stage premise. Related to the Unproven Basis logical fallacy.</p><h4>Reactive Devaluation</h4><p>Dismissing or devaluing a premise or conclusion solely because it is believed to have originated with an adversary.</p><h4>Salience Bias</h4><p>Giving undue weight to data or behavior that is abnormal or emotionally striking but scarce, while disregarding or devaluing that which is unremarkable but abundant.</p><h4>Selective Perception</h4><p>Perceiving that something is true, or that connections exist between unrelated events, based on one's beliefs rather than on objective evidence.</p><h4>Self-Serving Bias</h4><p>Disproportionately recognizing one's successes over one's failures, perceiving ambiguous outcomes as successes, and interpreting ambiguous information as supporting one's opinions and conclusions rather than contradicting them or being neutral.</p><h4>Shared Information Bias</h4><p>The tendency of groups to concentrate on information with which all members are already familiar and to spend little time or effort on information with which few members are familiar.</p><h4>Surrogation</h4><p>Mistaking the measurement of progress toward a goal for the goal itself.</p><h4>Zero-Sum Bias</h4><p>Perceiving situations as a zero-sum game between participants regardless of whether that is objectively the case.</p>]]></content:encoded></item></channel></rss>