Robots & Artificial Intelligence

Prominent transhumanist on Artificial General Intelligence: ‘We must stop everything. We are not ready.’

Posted on March 26, 2025 by Administrator

At last week’s SXSW conference, prominent transhumanist Eliezer Yudkowsky said that if the development of artificial general intelligence is not stopped immediately across the globe, humanity may be destroyed. “We must stop everything,” Yudkowsky said during a panel titled “How to Make AGI (Artificial General Intelligence) Not Kill Everyone.”