US SEC developing rules on AI ‘conflicts of interest’


Wall Street’s top regulator is developing rules to govern the use of artificial intelligence on trading platforms, a practice that poses a risk of conflicts of interest, the agency chief said in a speech on Monday.

The U.S. Securities and Exchange Commission will also need “new thinking” to confront challenges to financial stability presented by the use of technologies such as predictive analytics and machine learning, according to Chair Gary Gensler.

Gensler’s remarks are part of a broader U.S. government effort to promote what officials call “responsible” innovation while also managing what they say are threats the emerging technology poses to public safety.

If a trading platform’s AI system considers the interest of both the platform and its customers, “this can lead to conflicts of interest,” Gensler said, according to a copy of prepared remarks, adding that he had tasked SEC staff with recommending new regulatory proposals to address this.

AI could also amplify the world financial system’s interconnectedness, something for which current risk management models may not be prepared, Gensler said.

“Many of the challenges to financial stability that AI may pose in the future … will require new thinking on system-wide or macro-prudential policy interventions.”

Gensler’s remarks echoed statements he has made in recent months on managing risks created by the use of AI in finance.

According to the SEC’s most recent agenda for developing new regulations, officials are considering possible rule proposals, which could be unveiled later this year, to govern the potential for conflicts of interest in the use of AI and machine learning by investment advisers and broker-dealers.
