The technological singularity is the hypothetical future emergence of greater-than-human superintelligence through technological means. Since the capabilities of such intelligence would be difficult for an unaided human mind to comprehend, the occurrence of a technological singularity is seen as an intellectual event horizon, beyond which events cannot be predicted or understood.
Proponents of the singularity typically state that an “intelligence explosion”, in which superintelligent machines design successive generations of increasingly powerful minds, might occur very quickly and might not stop until the agent’s cognitive abilities greatly surpass those of any human.
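The runaway dynamic described above can be illustrated with a deliberately simple toy model (not from the source; all parameters and the growth rule are illustrative assumptions): if each generation's ability to design its successor scales with its own capability, growth is faster than exponential and the human baseline is left behind within a handful of generations.

```python
def generations_to_surpass(start=1.0, human_baseline=1.0, gain=0.1, factor=100.0):
    """Toy model of an 'intelligence explosion': each generation improves
    the next in proportion to its own capability, so growth accelerates.
    Counts generations until capability exceeds `factor` times the human
    baseline. All numbers here are hypothetical, purely for illustration."""
    capability, n = start, 0
    while capability < factor * human_baseline:
        # Better minds design disproportionately better successors.
        capability *= 1.0 + gain * capability
        n += 1
    return n

print(generations_to_surpass())
```

Under these assumed parameters the model crosses a hundredfold-human threshold after roughly a dozen generations; the point of the sketch is only the qualitative shape of the curve (self-reinforcing, super-exponential), not any quantitative prediction.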
The term was popularized by science fiction writer Vernor Vinge, who argues that artificial intelligence, human biological enhancement, or brain-computer interfaces could be possible causes of the singularity. The specific term “singularity” as a description for a phenomenon of technological acceleration causing an eventually unpredictable outcome in society was coined by mathematician and physicist Stanislaw Ulam as early as 1958, when he wrote of a conversation with John von Neumann concerning the “ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.” The concept has also been popularized by futurists such as Ray Kurzweil. Proponents expect the singularity to occur some time in the 21st century, although their estimates vary.