Late last month, several tech industry leaders, including Elon Musk and Steve Wozniak, signed an open letter urging the community to pause AI development. Stuart Russell, an AI expert and computer science professor at the University of California, Berkeley, was among the signatories and continues to push for that pause.
The letter appeared on the Future of Life Institute's website, and it reads in part:
“AI labs and independent experts should use this pause to jointly develop and implement a set of shared safety protocols for advanced AI design and development that are rigorously audited and overseen by independent outside experts. These protocols should ensure that systems adhering to them are safe beyond a reasonable doubt. This does not mean a pause on AI development in general, merely stepping back from the dangerous race to ever-larger unpredictable black-box models with emergent capabilities.”
“AI research and development should be refocused on making today’s powerful, state-of-the-art systems more accurate, safe, interpretable, transparent, robust, aligned, trustworthy, and loyal.”
Russell went a step further in a recent interview with Business Today, comparing unregulated AI development to a potential Chernobyl event. He and other experts are urging developers to ensure the safety of any AI systems before releasing them publicly.
The negative impact on humanity is still a big question. A Belgian man has already taken his own life, and an AI chatbot named Eliza is being blamed. Italy is already blocking ChatGPT over privacy concerns, and Germany may follow. One technologist who is over the moon about AI is Robert Scoble; he has some interesting posts on his Twitter feed. The AI development world is the wild west of technology; a lot is happening, and it feels a bit chaotic.
This article was published by Alex Hernández on Techaeris.