On March 22, the United States think tank Future of Life Institute (FOLI) released an open letter signed by more than 2,600 tech leaders and researchers, including Tesla CEO Elon Musk, Apple co-founder Steve Wozniak, and several AI company CEOs, CTOs, and researchers.

The letter urged a temporary halt to the development of artificial intelligence (AI), citing potential risks to society and humanity from training AI systems more powerful than GPT-4. FOLI called on all AI companies to pause such development for at least six months, given the profound implications of human-competitive intelligence.

The Future of Life Institute stated in their letter that the development of advanced AI could lead to a significant transformation in the history of life on Earth, and thus it should be approached with appropriate care and resources. However, the institute expressed concerns that the necessary planning and management for this level of development are not currently taking place.

GPT-4, the latest iteration of OpenAI's AI-powered chatbot, launched on March 14. It has achieved high scores on challenging U.S. high school and law exams, ranking in the 90th percentile.

Compared to the original version of ChatGPT, GPT-4 is said to be ten times more advanced. According to the Future of Life Institute, however, this development is part of an out-of-control race among AI firms to create ever more powerful AI systems that no one, including their creators, can reliably understand, predict, or control.

Among the letter's primary concerns were the possibilities of machines flooding information channels with propaganda and untruths, and of automating away all available jobs.

The Future of Life Institute went beyond these concerns, suggesting that the entrepreneurial pursuits of AI companies could pose an existential threat. The letter questioned whether developing nonhuman minds that could outnumber, outsmart, and replace humans was a risk worth taking and warned of the potential loss of control over civilization. Moreover, the letter emphasized that such crucial decisions should not be solely in the hands of unelected tech leaders.

The Future of Life Institute also endorsed OpenAI founder Sam Altman's recent statement that an independent review may be necessary before training future AI systems. In a blog post on February 24, Altman emphasized the importance of preparing for the development of artificial general intelligence (AGI) and artificial superintelligence (ASI).

However, not all AI experts have endorsed the call for a pause on AI development. For instance, Ben Goertzel, CEO of SingularityNET, stated on Twitter on March 29 that large language models (LLMs) would not develop into AGIs, which have yet to see significant advancement.

Instead, Goertzel suggested that the focus should be on slowing down research and development of bioweapons and nuclear weapons rather than AI. Large language models like ChatGPT are only one form of AI: deepfake technology has been widely used to create convincing image, audio, and video hoaxes, and AI-generated artwork has raised growing concerns about potential copyright violations.

During a shareholder call on March 28, Galaxy Digital CEO Mike Novogratz expressed surprise at how much regulatory attention cryptocurrencies receive compared to artificial intelligence, opining that the government has it "completely upside-down" and that too little attention is being paid to regulating AI. FOLI, for its part, has suggested that if a pause in AI development is not implemented quickly, governments should step in with a moratorium. The institute argued that such a pause should be public and verifiable and include all key actors.



Mar 30, 2023
Crypto News
