The SEC chief sees A.I. creating ‘conflicts of interest’ and maybe the next great financial crisis—unless we tackle ‘herding’


Gary Gensler isn’t an A.I. skeptic. In a Monday speech at the National Press Club, the Securities and Exchange Commission (SEC) Chair said that he believes A.I. is “the most transformative technology of our time, on par with the internet and mass production of automobiles.” But that doesn’t mean that the famously tough regulator, who has presided over a “crypto crackdown” in the last 12 months, thinks A.I. will be a good thing for markets.

As is the case with any major technological advancement, Gensler fears there will be “macro challenges for society” as A.I. rolls out to the public. On Monday, the former MIT Sloan School of Management professor detailed the risks that he sees A.I. posing to the global economy and argued that governments will need to rewrite their rulebooks to deal with the threat. 

He warned of potential “significant” changes to the labor market and increasing competition between the U.S. and China as the two global superpowers seek to develop A.I. systems, but that was just the beginning of the potential issues.

“A.I. may heighten financial fragility,” he said, adding that the technology could end up being a key feature in the “after-action reports” of the next financial crisis. Make no mistake, he said, this is just as big a deal as the emergence of the internet was in the mid-1990s—or the invention of the modern automobile in 1886.

Gensler’s sweeping comments on A.I. reflect his concerns about the fundamental shape of markets in the future, a departure from his recent enforcement actions on cryptocurrency, which turn on the technical definition of what is and isn’t a security. That question took a turn last week, when a judge ruled that the cryptocurrency Ripple counts as a security in some circumstances but not in others. But on the subject of A.I., Gensler looked into his crystal ball and saw a problem with “herding.” 

Big tech dominance of A.I. and financial stability

A number of economists have warned that the rise of A.I. could lead to significant job losses, particularly for white collar workers, but SEC Chair Gensler is more concerned with the risk to financial markets as A.I. systems increase the “inherent network interconnectedness of the global financial system.”

He noted that the A.I. space could end up being dominated by a small number of big tech giants, which increases the odds of “herding” in the markets as investors all get the same signals from A.I. systems on whether to buy or not.

“It could promote herding with individual actors making similar decisions because they are getting the same signal from a base model or data aggregator. This could encourage monocultures,” he warned. 
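
To make that dynamic concrete, here is a minimal, hypothetical sketch, not drawn from Gensler’s speech or any SEC analysis, of how a signal shared by many traders can push their decisions onto the same side of the market, while independent signals leave them roughly split. The agent counts, signal model, and the `share_using_base_model` parameter are all illustrative assumptions.

```python
import random

# Hypothetical illustration (not an SEC model): many trading agents act on
# signals. Some read a single shared "base model" signal; the rest get
# independent signals. Shared signals produce correlated, one-sided trades.

def shared_signal() -> float:
    """One signal drawn once per period and shown to every subscriber."""
    return random.gauss(0, 1)

def independent_signal() -> float:
    """A fresh, uncorrelated signal for a single agent."""
    return random.gauss(0, 1)

def simulate(num_agents: int = 1000, share_using_base_model: float = 0.8) -> float:
    """Return the fraction of agents that end up on the majority side of the trade."""
    base = shared_signal()
    decisions = []
    for _ in range(num_agents):
        signal = base if random.random() < share_using_base_model else independent_signal()
        decisions.append(1 if signal > 0 else -1)  # 1 = buy, -1 = sell
    buys = decisions.count(1)
    return max(buys, num_agents - buys) / num_agents

if __name__ == "__main__":
    random.seed(0)
    # With most agents reading the same base model, trades cluster on one side;
    # with fully independent signals, the split stays close to 50/50.
    print("80% shared signal:", round(simulate(share_using_base_model=0.8), 2))
    print(" 0% shared signal:", round(simulate(share_using_base_model=0.0), 2))
```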

To Gensler’s point, a 2001 study by Markus Konrad Brunnermeier, an economics professor at Princeton University, found that investors’ herding behaviors “help to explain” stock market crashes. And a 2022 study from Asad Ayoub and Ayman Balawi of the University of Pécs in Hungary found that herding behavior drives stock prices during both bear and bull markets. Gensler warned that this herding behavior would be even worse if a few big tech firms dominated the A.I. space. 

“The possibility of one or even a small number of AI platforms dominating raises issues with regard to financial stability,” he said. “While at MIT, Lily Bailey and I wrote a paper about some of these issues called ‘Deep Learning and Financial Stability.’ The recent advances in generative AI models make these challenges more likely.”

Privacy, intellectual property, and conflicts of interest

In his Monday speech, Gensler went on to highlight a few more key issues he’s been pondering amid A.I.’s rollout. 

First, he warned that the technology raises serious challenges when it comes to data privacy and intellectual property, noting that the concurrent Hollywood writers’ and actors’ strikes are seeking to address some of those issues. Screenwriters are striking over compensation disputes and the use of A.I. in entertainment productions, arguing that the technology could replace them and is being trained on their work to do so.

Gensler admitted in his speech that “we’re all helping train the parameters of A.I. models,” which leads to the question: “Whose data is it?” That debate “is playing out right now,” the SEC chair said, adding that he will be monitoring it closely moving forward. 

“For the SEC, the challenge here is to promote competitive, efficient markets in the face of what could be dominant base layers at the center of the capital markets. I believe we closely have to assess this so that we can continue to promote competition, transparency, and fair access to markets,” he said.

Gensler also warned brokers and financial advisers that A.I. could increase potential conflicts of interest in their businesses, hinting at a potential crackdown on the use of A.I. to steer clients toward specific financial products. 

“If the optimization function in the AI system is taking the interest of the platform into consideration as well as the interest of the customer, this can lead to conflicts of interest. In finance, conflicts may arise to the extent that advisers or brokers are optimizing to place their interests ahead of their investors’ interests,” he said.
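
As a hypothetical illustration of the conflict Gensler describes, not an actual broker system or SEC example, the sketch below scores products by blending the customer’s expected return with the platform’s own fee revenue. Once the platform’s interest carries enough weight, the recommendation flips away from the product that best serves the customer. Every product, number, and weight here is a made-up assumption.

```python
from dataclasses import dataclass

# Hypothetical illustration: a recommendation score that mixes the customer's
# expected outcome with the platform's own revenue. When platform_weight > 0,
# the optimization function no longer ranks products purely in the customer's
# interest -- the conflict described in the quote above.

@dataclass
class Product:
    name: str
    expected_customer_return: float  # annualized return to the customer, as a fraction
    platform_fee_revenue: float      # revenue to the broker/adviser, as a fraction

def score(product: Product, platform_weight: float) -> float:
    """Blend customer interest and platform interest into one ranking score."""
    return ((1 - platform_weight) * product.expected_customer_return
            + platform_weight * product.platform_fee_revenue)

def recommend(products: list[Product], platform_weight: float) -> Product:
    """Return the highest-scoring product under the chosen weighting."""
    return max(products, key=lambda p: score(p, platform_weight))

if __name__ == "__main__":
    menu = [
        Product("Low-cost index fund", expected_customer_return=0.07, platform_fee_revenue=0.001),
        Product("High-fee structured note", expected_customer_return=0.04, platform_fee_revenue=0.030),
    ]
    # Optimizing purely for the customer picks the index fund; putting enough
    # weight on platform revenue flips the recommendation.
    print(recommend(menu, platform_weight=0.0).name)  # Low-cost index fund
    print(recommend(menu, platform_weight=0.6).name)  # High-fee structured note
```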

The warning comes after Gensler in June asked his staff for recommendations, which have yet to be made public, on what should be done to prevent these conflicts of interest. 

Finally, with A.I. creating a litany of potential issues for the SEC and government officials, Gensler argued current regulations are not “sufficient” and will need to be “updated.”

“Many of the challenges to financial stability that AI may pose in the future…will require new thinking on system-wide or macro-prudential policy interventions,” he said. 

To help update regulations for this new era, Gensler proposed a somewhat ironic solution: A.I.

“While recognizing the challenges, we at the SEC also could benefit from staff making greater use of AI in their market surveillance, disclosure review, exams, enforcement, and economic analysis,” he said.