Differential technological development
{{Short description|Strategy of technology governance}}
{{Third-party sources|date=October 2024}}
'''Differential technological development''' is a strategy of technology governance aiming to decrease risks from emerging technologies by influencing the sequence in which they are developed. Using this strategy, societies would strive to delay the development of harmful technologies and their applications while accelerating the development of beneficial technologies, especially those that offer protection against harmful technologies.<ref>{{cite journal|author=Bostrom, Nick|author-link=Nick Bostrom|year=2002|journal=Journal of Evolution and Technology|title=Existential Risks: Analyzing Human Extinction Scenarios|url=http://jetpress.org/volume9/risks.html}} [https://ora.ox.ac.uk/objects/uuid:827452c3-fcba-41b8-86b0-407293e6617c Oxford Research Archive]</ref><ref>{{Cite book|last=Ord|first=Toby|title=The Precipice: Existential Risk and the Future of Humanity|publisher=Bloomsbury Publishing|year=2020|isbn=978-1526600219|location=United Kingdom|pages=200}}</ref>
== History of the idea ==
Differential technological development was initially proposed by philosopher Nick Bostrom in 2002, and he applied the idea to the governance of artificial intelligence in his 2014 book ''Superintelligence: Paths, Dangers, Strategies''.<ref>{{cite book|last=Bostrom|first=Nick|title=Superintelligence: Paths, Dangers, Strategies|date=2014|publisher=Oxford University Press|isbn=978-0199678112|location=Oxford|pages=229–237}}</ref> Philosopher Toby Ord also endorsed the strategy in his 2020 book ''The Precipice: Existential Risk and the Future of Humanity'', writing that "While it may be too difficult to prevent the development of a risky technology, we may be able to reduce existential risk by speeding up the development of protective technologies relative to dangerous ones."<ref>{{Cite magazine|last=Purtill|first=Corinne|title=How Close Is Humanity to the Edge?|url=https://www.newyorker.com/culture/annals-of-inquiry/how-close-is-humanity-to-the-edge|access-date=2020-11-27|magazine=The New Yorker|date=21 November 2020|language=en-us}}</ref>
== Informal discussion ==
Paul Christiano believes that while accelerating technological progress appears to be one of the best ways to improve human welfare in the next few decades, a faster rate of growth cannot be equally important for the far future, because growth must eventually saturate due to physical limits. Hence, from the perspective of the far future, differential technological development appears more crucial.<ref>{{cite news|last1=Christiano|first1=Paul|title=On Progress and Prosperity|url=http://effective-altruism.com/ea/9f/on_progress_and_prosperity/|website=Effective Altruism Forum|access-date=21 October 2014|date=15 Oct 2014}}</ref>
Inspired by Bostrom's proposal, Luke Muehlhauser and Anna Salamon suggested a more general project of "differential intellectual progress", in which society advances its wisdom, philosophical sophistication, and understanding of risks faster than its technological power.<ref>{{cite web|last=Muehlhauser|first=Luke|author2=Anna Salamon|title=Intelligence Explosion: Evidence and Import|date=2012|pages=18–19|url=http://intelligence.org/files/IE-EI.pdf|access-date=29 November 2013|archive-url=https://web.archive.org/web/20141026105011/http://intelligence.org/files/IE-EI.pdf|archive-date=26 October 2014|url-status=dead}}</ref><ref>{{cite web|last=Muehlhauser|first=Luke|title=Facing the Intelligence Explosion|date=2013|publisher=Machine Intelligence Research Institute|url=http://intelligenceexplosion.com/2012/ai-the-problem-with-solutions/|access-date=29 November 2013}}</ref> Brian Tomasik has expanded on this notion.<ref>{{cite web|last1=Tomasik|first1=Brian|title=Differential Intellectual Progress as a Positive-Sum Project|url=http://foundational-research.org/differential-intellectual-progress-as-a-positive-sum-project/|website=Foundational Research Institute|access-date=18 February 2016|date=23 Oct 2013}}</ref>
== References ==
{{Reflist}}
{{Future of Humanity Institute}}
{{emerging technologies|topics=yes}}
[[Category:Technology forecasting]]
{{tech-stub}}
{{future-stub}}