OAKLAND, Calif. - Some of the people likely to profit from high-tech artificial intelligence are now asking everyone involved in it to take a pause and think about the potential damage that could be wrought if it goes out of control.
On Thursday, the artificial intelligence chatbot ChatGPT will mark only its fifth month of existence, and it is already changing the world.
Midjourney is one of the latest A.I. image generators where anyone can fashion their own creations. It has produced photos of house fires in Oakland that never happened, a photo of a seemingly distraught, nonexistent person whose money was tied up in the Silicon Valley Bank failure, and a fake photo of Elon Musk in a Tesla racing down the streets of San Francisco.
On YouTube, you can see videos made by researchers from the University of Washington that show video clones of famous people saying anything, in any background, no matter how false or outrageous.
"The cat is out of the bag," one expert said.
A petition signed by more than 1,300 people, from tech experts and prominent venture capitalists to academics, seeks a six-month pause in A.I. development, citing an "out-of-control race" in which developers can't "understand, predict, or reliably control" the ramifications of A.I. The signers say A.I. presents a profound risk to society and humanity.
Notable signatories include Apple co-founder Steve Wozniak, Tesla's Elon Musk and former presidential candidate Andrew Yang.
"We need to think about the pros, the cons [and] the consequences that could be there and take a deep look," said Domingo Guerra, head of trust at Incode, an identity verification firm that uses biometrics to verify users' online identity.
"We need some sort of guardrails or best practices, or dos and don'ts that we can agree to as an industry," Guerra added. "This version of A.I. could be weaponized to overwhelm the world in disinformation. It is like a modern arms race."
A.I. could impact over 300 million jobs worldwide, according to an analysis by Goldman Sachs.