Meta Shuts Down Facebook’s Facial Recognition System


Investing.com — Facebook parent company Meta Platforms Inc (NASDAQ:FB) said Tuesday that it is shutting down the social media app’s facial recognition system as part of a company-wide plan to restrict the use of facial recognition in its products. 

Shares of Meta Platforms Inc (NASDAQ:FB), which still trade under the Facebook name though the company will eventually change tickers, dipped 0.6% on Tuesday.

The company said that over one-third of its daily active users had opted into its facial recognition setting and were able to be recognized, with the move likely to result in more than a billion people’s facial recognition templates being deleted.

“As part of this change, people who have opted in to our Face Recognition setting will no longer be automatically recognized in photos and videos, and we will delete the facial recognition template used to identify them,” the company stated. 

However, the change will also impact Automatic Alt Text (AAT), which creates image descriptions for blind and visually impaired people. AAT will no longer be able to include the names of people but will function normally otherwise.

Meta cited concerns about the place of facial recognition technology in society as one of the reasons for its removal. However, the company said it still sees facial recognition as a powerful tool in certain cases, such as when people need to verify their identity or prevent fraud and impersonation.

Meta went on to say, “regulators are still in the process of providing a clear set of rules governing its use. Amid this ongoing uncertainty, we believe that limiting the use of facial recognition to a narrow set of use cases is appropriate.”

Meta, formerly Facebook, has come under pressure in the past month after a former employee released a trove of company documents showing, among other things, that the company knew Instagram was harming some users’ mental health and that an algorithm change had increased divisiveness on the platform. It is the latest in a series of scandals the company has faced.