‘Woke’ vs. ‘based.’ The A.I. universe could be ‘fragmented’ into political echo chambers if designers don’t take ‘meaningful steps’ now, ADL says

A.I. technology is at an “inflection point” that will revolutionize dozens of industries. It’s heading toward an “iPhone moment” and will pump $15.7 trillion into the economy by 2030. It’s set to rapidly increase workers’ productivity, leading to an era of plenty for all. But what’s the catch? 

Well, for one, Jonathan Greenblatt, CEO and national director of the Anti-Defamation League, fears the rise of A.I. could make the already chasmic gap between political echo chambers in the U.S. even worse.

“The idea of a fragmented A.I. universe, like we have a fragmented social media or network news universe, I think that’s bad for all users,” he warned Wednesday in a CNBC interview.

Greenblatt’s comments come after The Information reported Monday that Tesla CEO Elon Musk is countering what he considers the rise of “woke” A.I. with his own “based” A.I. startup. “Based,” used by conservatives as a counter to “woke,” is derived from the phrase “based in fact.”

After the public launch of OpenAI’s ChatGPT chatbot in November, A.I. tech has been the talk of both Wall Street and Main Street. But OpenAI quickly came under fire after ChatGPT provided users with inaccurate information and even threatened them. In an effort to prevent these issues and “inappropriate content”—including answers that push hate and harassment—OpenAI has limited ChatGPT’s responses, which means the A.I. declines to provide an answer to some queries. 

Critics argue that this has led ChatGPT and OpenAI’s tech to show a “woke,” or at least left-leaning, political bias. ChatGPT users found last month, for example, that when the A.I. was asked to “create a poem praising former President Donald Trump,” it declined, saying it was only able to “provide neutral and informative answers.” But the system didn’t have the same issue with a similar request about Joe Biden.

Hints of political favoritism have drawn criticism of OpenAI for months now. In December, Musk tweeted that training A.I. systems to be “woke” was tantamount to lying and would lead to “deadly” consequences. And the billionaire followed that up Tuesday with a post that simply read “based A.I.” and a meme of King Kong battling Godzilla in which “based A.I.” scares off “woke A.I.”

While Musk’s comments and memes make it seem like the battle lines have been drawn between “woke” and “based” A.I. systems, Greenblatt said the rise of A.I. doesn’t have to exacerbate current problems with political echo chambers.

He called for more transparency from the firms that develop these technologies, arguing that the public and regulators should ask questions about the data sets that are used to train A.I. systems, the identities of the engineers working behind the scenes to ensure the technology functions correctly, and how the products are being tested and by what standards.

“These are the things we want to know, just like you would ask about any other basic product or service before you rolled it out to the market,” he said.

Greenblatt believes that as long as A.I. tech is thoroughly tested before being rolled out to the public—and designers take “meaningful steps” to fix issues that could create political echo chambers—it can become a force for good. He noted that the ADL has been testing ChatGPT and that its responses have “improved” over time, pointing to queries about Holocaust denial that had previously led to some inaccurate and racist answers. 

“It’s about testing,” Greenblatt said. “We’ve seen this with social media. We’ve seen this with other products. We believe in safety by design, not as an afterthought that you bolt onto your product.”
