Ex Machina did a great job of exploring the control problem for AGI.
Nick Bostrom's book Superintelligence spooked Elon Musk and motivated others like Bill Gates and Stephen Hawking to take AI seriously. Once we invent some form of AGI, how do we keep it under control? Will it want to get out? Do we keep it in some server room in an underground bunker? How do we know if it's trying to get out? If it's an attractive girl, maybe it will try to seduce men.
Thanks! I've already ordered it. I have to finish a good book on AI first (Superintelligence: Paths, Dangers, Strategies), but that one's next on my list!
I dunno, philosophers might be warning about huge existential crises, but nobody's gonna listen.
https://www.amazon.com/dp/B00LOOCGB2/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1
There is actually an answer to this question: he read this book.
I read it too, and I can honestly say it is the scariest thing I have ever read.
/r/ControlProblem
How do we ensure that future artificial superintelligence has a positive impact on the world?
"People who say that real AI researchers don’t believe in safety research are now just empirically wrong." - Scott Alexander
"The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else." - Eliezer Yudkowsky
Check out our new wiki
Talks at Google: Ensuring Smarter-than-Human Intelligence has a Positive Outcome
WaitButWhy on Superintelligence (Lengthy but entertaining & accessible. Also read part 2)
Three areas of research on the Control Problem (in-depth, organizes different topics w/ links to academic papers)
It won't matter whether you're lazy or productive when the time for basic income arrives - automation will have replaced the majority of the workforce. Before you say "but stable boys became mechanics during the industrial revolution, and innovation in the economy will create new jobs for them", I have news for you: it won't. It's not just physical labor being replaced this time - it's mental labor too.
The Rise of the Machines – Why Automation is Different this Time
*Superintelligence: Paths, Dangers, Strategies*, by Nick Bostrom
Why does that statement not hold up? Check out Superintelligence. Specialized machine learning is not the same as strong generalized AI.
That's a big question; this book answers everything: http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom-ebook/dp/B00LOOCGB2
For better or worse, it's something I want to see in my lifetime.
I.e., yes, there is probably lots of funding - it just happens to be secret. You are not clever enough or worthy enough to know about it. There is a supreme danger here of AI risk, where the machines become smarter than us and we become noise after that. If you doubt what I say, read Superintelligence by Nick Bostrom, or Smarter Than Us by Stuart Armstrong.