Source: @Annu297
Vincent Müller and Nick Bostrom of FHI conducted a poll of four groups of AI experts in 2012-13. Across the combined groups, the median year by which respondents gave a 10% chance of human-level AI was 2022, and the median year by which they gave a 50% chance was 2040.
Details
According to Bostrom, the participants were asked when they expected “human-level machine intelligence” to be developed, defined as “one that can carry out most human professions at least as well as a typical human”. The results were as follows; the surveyed groups are described below the table.
| Group | Response rate | 10% | 50% | 90% |
|----------|---------------|------|------|------|
| PT-AI | 43% | 2023 | 2048 | 2080 |
| AGI | 65% | 2022 | 2040 | 2065 |
| EETN | 10% | 2020 | 2050 | 2093 |
| TOP100 | 29% | 2022 | 2040 | 2075 |
| Combined | 31% | 2022 | 2040 | 2075 |
Figure 1: Median dates for different confidence levels for human-level AI, given by different groups of surveyed experts (from Bostrom, 2014).
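To make the statistic concrete: each respondent named a year for each confidence level, and every cell in the table is the median of those years across a group's respondents. The following minimal Python sketch uses invented response years (not the actual survey data) to show how one group's row of medians would be computed.

```python
import statistics

# Hypothetical respondent answers, invented for illustration only
# (NOT the actual Müller & Bostrom survey data). Each respondent names
# the year by which they assign a 10%, 50%, and 90% probability to
# human-level machine intelligence.
responses = [
    {"10%": 2020, "50%": 2036, "90%": 2062},
    {"10%": 2023, "50%": 2041, "90%": 2078},
    {"10%": 2026, "50%": 2047, "90%": 2092},
    {"10%": 2024, "50%": 2039, "90%": 2071},
    {"10%": 2031, "50%": 2056, "90%": 2103},
]

# Each cell in the table is the median, across a group's respondents,
# of the year named for that confidence level.
for level in ("10%", "50%", "90%"):
    years = [r[level] for r in responses]
    print(f"{level}: median year = {statistics.median(years)}")
```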
Surveyed groups:
PT-AI: Participants at the 2011 Philosophy and Theory of AI conference. Judging by the list of speakers, this group appears to have contained a fairly even mixture of philosophers, computer scientists, and others (e.g. cognitive scientists). According to the paper, they tend to be interested in theory, not to do technical AI work, and to be skeptical that AI progress will come easily.
AGI: Participants at the 2012 AGI-12 and AGI Impacts conferences. These people mostly do technical work.
EETN: Members of the Greek Association for Artificial Intelligence, which only accepts published AI researchers.
TOP100: The 100 most-cited authors in artificial intelligence, across all years, according to Microsoft Academic Search in May 2013. These people mostly do technical AI work, and tend to be relatively old and based in the US.
Source: AI Impacts
To get a more accurate assessment of the opinion of leading researchers in the field, I turned to the Fellows of the American Association for Artificial Intelligence, a group of researchers who are recognized as having made significant, sustained contributions to the field.
In early March 2016, AAAI sent out an anonymous survey on my behalf, posing the following question to 193 fellows:
“In his book, Nick Bostrom has defined Superintelligence as ‘an intellect that is much smarter than the best human brains in practically every field, including scientific creativity, general wisdom and social skills.’ When do you think we will achieve Superintelligence?”
…In essence, according to 92.5 percent of the respondents, superintelligence is beyond the foreseeable horizon.
See also Oren Etzioni on Building Intelligent Machines
From Oren Etzioni’s presentation at the O’Reilly AI conference, September 2016: