Vitalik Buterin Proposes Global Limits to Halt Superintelligent AI Danger

Key Takeaways

  • Vitalik Buterin suggests a "soft pause" on industrial-scale computing to manage superintelligent AI risks;
  • Proposed controls include hardware restrictions requiring global approval via blockchain;
  • Buterin advocates cautious AI development if legal accountability proves inadequate.
Ethereum co-founder Vitalik Buterin explored the idea of temporarily limiting global computing power in a January 5 blog post to address the risks associated with superintelligent artificial intelligence (AI).

This concept, described as a "soft pause," would involve reducing industrial-scale computing capacity by up to 99% for one to two years if dangerous forms of artificial intelligence emerge.

The goal is to allow humanity more time to prepare for potential challenges.

One suggestion involves installing specialized hardware controls in industrial-scale AI systems. These systems would require regular approval from international bodies to continue functioning. Buterin explained that this process would involve three signatures, which could be verified through blockchain technology.
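Buterin's post does not spell out an implementation, but the gate he describes resembles a multi-signature check. The sketch below is a minimal, hypothetical Python illustration of a device-side 3-of-3 approval: compute is only permitted for a given week if all three designated bodies have signed that week's approval message. The key handling, message format, and weekly cadence are illustrative assumptions, not details from Buterin's proposal; in practice, such approvals could be published to and verified on a blockchain.

```python
# Hypothetical sketch of a 3-of-3 approval gate, using the third-party
# "cryptography" package for Ed25519 signatures. Names and message format
# are illustrative assumptions, not part of Buterin's proposal.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Stand-ins for the three international bodies' signing keys.
signer_keys = [Ed25519PrivateKey.generate() for _ in range(3)]
trusted_pubkeys = [k.public_key() for k in signer_keys]

def approval_message(week: int) -> bytes:
    """The message each body signs for a given approval window."""
    return f"allow-industrial-compute:week={week}".encode()

def device_may_run(week: int, signatures: list[bytes]) -> bool:
    """3-of-3 check: every trusted body must have signed this week's message."""
    msg = approval_message(week)
    if len(signatures) != len(trusted_pubkeys):
        return False
    for pubkey, sig in zip(trusted_pubkeys, signatures):
        try:
            pubkey.verify(sig, msg)
        except InvalidSignature:
            return False
    return True

# Example: all three bodies sign week 42, so the device may run that week.
sigs = [k.sign(approval_message(42)) for k in signer_keys]
print(device_may_run(42, sigs))  # True
print(device_may_run(43, sigs))  # False: signatures cover a different week
```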

Buterin noted that he would only support such measures if simpler solutions, such as holding developers legally responsible for harm caused by AI, prove insufficient. Liability rules, while important, may not be enough to address the potential dangers of uncontrolled AI development.

This approach aligns with Buterin’s concept of "defensive accelerationism" (d/acc), which advocates a careful and measured approach to technological development. It contrasts with "effective accelerationism" (e/acc), which pushes for rapid and unrestricted innovation.

Superintelligent AI, which refers to systems that far exceed human intelligence in every field, has become a concern among technology leaders and researchers. In March 2023, more than 2,600 experts signed an open letter calling for a pause in AI development due to its possible threats to society and humanity.
