When the man at the forefront of some of the most cutting-edge enterprises in the world warns you about potentially disastrous technological dangers, you should probably listen, right? So pay attention to a warning from Elon Musk, the entrepreneur behind Tesla, PayPal and SpaceX. During an interview on CNBC this past week, he warned about artificial intelligence—you know, computers thinking for themselves. "I think there's things that are potentially dangerous out there. ...There's been movies about this, like 'Terminator,'" he said on CNBC's "Closing Bell." "There's some scary outcomes and we should try to make sure the outcomes are good, not bad."



It's a somewhat ironic comment coming from him, since he recently invested in an artificial intelligence company, Vicarious, a start-up working to enable machines to mimic the human brain.

"It's not from the standpoint of actually trying to make any investment return," he explained. "It's purely I would just like to keep an eye on what's going on with artificial intelligence." Musk's warning is almost identical to that of another really, really smart guy—renowned physicist Stephen Hawking. "Success in creating A.I. would be the biggest event in human history," Hawking wrote in a co-authored column in early May. "Unfortunately, it might also be the last, unless we learn how to avoid the risks."



Hawking reiterated the warning recently in a pretty hysterical bit with comedian John Oliver on Oliver's new HBO show, "Last Week Tonight." Hawking pointed out that artificial intelligence could design improvements to itself and outsmart humans.

"I know you're trying to get people to be cautious there but why should I not be excited about fighting a robot?" asked Oliver. "You would lose," said Hawking.

Not all smart folks are worried, however. Roger McNamee, a well-known technology investor, poked fun at the notion. "I don't think just being a billionaire means that the things you think out loud are important," the Elevation Partners co-founder said. "I would like to worry about the problems that are killing us today as opposed to the ones that may kill us in 20 years. If you want something to worry about, you know, just read the newspaper. ... There's a good chance we will have polluted the earth beyond repair way before they can get any of this A.I. stuff to work. So, I'm looking at it and going, 'Seriously? Let's keep our eye on the ball.'"