Today we’re presenting the third installment of my conversation with Naval Ravikant about existential risks. This interview first appeared in March, as two back-to-back episodes of the After On Podcast (which now features 50 unhurried conversations with world-class thinkers, founders, and scientists). Naval is one of tech’s most successful angel investors and the founder of multiple startups—including seed-stage investment platform AngelList. Please check out parts one and two of this conversation if you missed them. Otherwise, you can press play on the embedded audio player or pull up the transcript, both of which are below.
In this segment, Ravikant and I move on from yesterday’s topic of AI risk to the dangers inherent in the rise of synthetic biology, or synbio. Here, I should disclose that I am a hopeless synbio fanboy. I’ve gotten to know many of the field’s top figures through my podcast, and I essentially revere both their work and its potential. But even the most starry-eyed synbio booster cannot ignore the technology’s annihilating potential.
A big topic in today’s segment is a genetic hack performed on H5N1 flu. This nasty bug kills a higher proportion of those infected than even Ebola (as discussed in some detail in this piece on Ars yesterday). But because its wild form is barely contagious to humans, it has historically killed very few of us. In 2011, however, independent research teams in Wisconsin and Holland modified H5N1’s genome to make it virulently contagious.
This didn’t present a huge immediate risk. Virtually no one was in a position to do something along these lines back then, and the people who pulled it off were virologists—not terrorists. But genetic hacking has since gotten radically easier, and it will continue to do so at a frenetic pace.
In this context, Ravikant and I discuss a quote from Hot Zone author Richard Preston: “The main thing that stands between the human species and the creation of a super virus, is a sense of responsibility among the individual biologists.”
If the list of empowered biologists is extremely short—and includes nobody corruptible, overconfident, moody, or even slightly incompetent—we might safely rely upon that sense of responsibility. But like computing, synbio is an exponential technology. And the history of exponential technologies is one of constant proliferation. That was fantastic when calculators were swept from science labs to middle schools. It was transformative when computers migrated from elite universities to countless private homes. But what happens when bioweaponization techniques spread from the pinnacles of science to the average high school bio lab?
One of the world’s most influential bioengineers—George Church—posted this fascinating and timely opinion piece to Ars today, contemplating where synbio’s improvement curve might take us and how we might mitigate the risks that come with it. I strongly recommend it to anyone interested in this subject.
This special edition of the Ars Technicast podcast can be accessed in the following places:
iTunes:
https://itunes.apple.com/us/podcast/the-ars-technicast/id522504024?mt=2 (Might take several hours after publication to appear.)
RSS:
http://arstechnica.libsyn.com/rss
Stitcher:
http://www.stitcher.com/podcast/ars-technicast/the-ars-technicast
Libsyn:
http://directory.libsyn.com/shows/view/id/arstechnica