imagine that they'll be incredibly dangerous and that you'll in fact have a nonproliferation problem. And in the book, we talk about proliferation. Obviously, Dr. Kissinger was one of the prime architects of why we're still alive today, in terms of mutually assured destruction and so forth. And what will happen with AGI is that there'll be a relatively small number of these systems, because they're very expensive, I mean immensely expensive, and they'll be guarded. And in particular, you're not going to want some terrible terrorist to say to the system, "Tell me how I can kill a million people who are not my race." So we are playing with technologies which eventually will have the level of impact and concern that nuclear weapons did 70 years ago.

So maybe I'll jump in for a sec. I totally agree with Eric, and I might broaden it a little bit beyond that: it's not just that killer robots aren't on the list, it's that so much of the everyday understanding of AI comes from science fiction. And what