
That's an interesting angle, though I'd personally rate the higher risk as humans who do have agency over technology. Technology is a multiplier of human ability, which is a problem when that ability includes destruction. The 20th century brought nuclear weapons, for example, which have been kept remarkably out of use through, I think, a combination of luck and extreme suppression. It happens that some of the engineering challenges in building a nuclear weapon are particularly hard to DIY, even when you understand the science (fuel enrichment is apparently the most significant bottleneck, with a secondary bottleneck at actual weapon construction). And governments make extensive efforts to keep any sort of nuclear DIY scene from developing, ensuring that even harmless, science-fair-type applications of practical nuclear knowledge don't arise "in the wild".

Will all of that also hold for 21st-century destructive technologies? If any technology appears that would let one person kill 20 million people, then either it will have to be very hard to DIY, or we'll have to be very good at suppressing it, or, most likely, both at once.


