This concept, described as a "soft pause," involves reducing industrial-scale computing capacity by up to 99% for one to two years if dangerous forms of artificial intelligence emerge.
The goal is to allow humanity more time to prepare for potential challenges.
One suggestion involves installing specialized hardware controls in industrial-scale AI systems, which would require regular approval from international bodies to continue functioning. Buterin explained that this process would involve three signatures, which could be verified through blockchain technology.
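The approval mechanism described above can be illustrated with a minimal sketch. This is purely hypothetical code, not Buterin's proposal or any real system: the signer names, keys, and epoch scheme are invented for illustration, and HMAC stands in for the digital signatures a real scheme would use.

```python
import hmac
import hashlib

# Hypothetical international bodies whose joint approval the hardware
# would require each epoch (e.g. each week). Keys are placeholders.
SIGNER_KEYS = {
    "body_a": b"placeholder-key-a",
    "body_b": b"placeholder-key-b",
    "body_c": b"placeholder-key-c",
}

def sign(key: bytes, message: bytes) -> str:
    """Stand-in for a real digital signature: HMAC-SHA256 over the message."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def device_may_run(epoch: int, approvals: dict) -> bool:
    """Allow operation only if all three bodies signed this epoch's message."""
    message = f"approve-epoch-{epoch}".encode()
    return all(
        hmac.compare_digest(approvals.get(name, ""), sign(key, message))
        for name, key in SIGNER_KEYS.items()
    )

# All three bodies sign the approval for epoch 42:
msg = b"approve-epoch-42"
approvals = {name: sign(key, msg) for name, key in SIGNER_KEYS.items()}
print(device_may_run(42, approvals))   # True: all signatures present
print(device_may_run(43, approvals))   # False: approvals are stale
```

Because each epoch's message differs, old signatures cannot be replayed; hardware that stops receiving fresh approvals simply stops running, which is the enforcement point of the proposal.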
Buterin noted that he would support such measures only if simpler solutions, such as holding developers legally liable for harm caused by AI, prove insufficient. Liability rules, while important, may not be enough to address the potential dangers of uncontrolled AI development.
This approach aligns with Buterin’s concept of "defensive accelerationism" (d/acc), which advocates a careful and measured approach to technological development. It contrasts with "effective accelerationism" (e/acc), which pushes for rapid and unrestricted innovation.
Superintelligent AI, which refers to systems that far exceed human intelligence in every field, has become a concern among technology leaders and researchers. In March 2023, more than 2,600 experts signed an open letter calling for a pause in AI development due to its possible threats to society and humanity.