Scientists Claim There Is a 5% Chance AI Could Lead to Human Extinction
AI researchers generally believe that the development of superintelligent AI poses a small but non-negligible risk of human extinction. In the largest survey of AI researchers to date, about 58% of respondents estimated at least a 5% chance that AI leads to human extinction or other extremely bad AI-related outcomes. Researchers disagree widely over the timeline for future AI milestones and remain uncertain about the social consequences AI may bring. The survey also reveals more immediate concerns among AI researchers about AI-driven scenarios such as deepfakes, manipulation of public opinion, and weaponization.