(Reuters) – OpenAI, the startup behind the wildly popular ChatGPT artificial intelligence chatbot, said Thursday it will award 10 equal grants from a fund of $1 million for experiments that explore democratic processes to determine how AI software should be governed to address bias and other factors.
The $100,000 grants will go to recipients who present compelling frameworks for answering such questions as whether AI ought to criticize public figures and what it should consider the “median individual” in the world, according to a blog post announcing the fund.
Critics say AI systems like ChatGPT carry inherent bias because of the inputs used to shape their views. Users have found examples of racist or sexist outputs from AI software, depending on which queries they are answering. Concerns are growing that AI working alongside search engines like Alphabet (NASDAQ:GOOGL) Inc’s Google and Microsoft (NASDAQ:MSFT) Corp’s Bing may produce incorrect information in a convincing fashion.
OpenAI, backed by $10 billion from Microsoft, has been leading the call for regulation of AI. Yet it recently threatened to pull out of the European Union over proposed rules that it said may be too onerous.
“The current draft of the EU AI Act would be over-regulating, but we have heard it’s going to get pulled back,” OpenAI’s chief executive Sam Altman told Reuters. “They are still talking about it.”
The startup’s grants would not go far toward funding much AI research: salaries for AI engineers and others working in the red-hot sector easily top $100,000 and can run to $300,000 or more.
AI systems “should benefit all of humanity and be shaped to be as inclusive as possible,” OpenAI said in the blog post. “We are launching this grant program to take a first step in this direction.”
The San Francisco startup said results of the funding could shape its own views on AI governance, though it said no recommendations would be “binding.”