Since 2019, Bumble has been using machine learning to protect its users from lewd photos. The feature, known as Private Detector, checks images sent by matches to see whether they contain inappropriate content. It's primarily designed to catch unsolicited nude photos, but it can also flag shirtless selfies and images of guns, neither of which is allowed on Bumble. When there's a positive match, the app blurs the offending image so you can choose to view it, block it, or report the person who sent it.
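The detect-then-blur flow described above can be illustrated with a minimal sketch. This is not Bumble's actual implementation; the classifier below is a stand-in for the Private Detector model, and the threshold and blur radius are illustrative assumptions.

```python
from PIL import Image, ImageFilter

LEWD_THRESHOLD = 0.5  # hypothetical decision threshold


def classify_lewd_probability(image: Image.Image) -> float:
    """Stand-in for the real model: return a probability in [0, 1] that the image is lewd."""
    # In practice, this would run an image classifier such as the open-sourced
    # Private Detector model over the image.
    return 0.0


def prepare_for_display(path: str) -> Image.Image:
    """Blur the image if the classifier flags it; otherwise return it unchanged."""
    image = Image.open(path)
    if classify_lewd_probability(image) >= LEWD_THRESHOLD:
        # Heavy Gaussian blur so the recipient can still choose to reveal,
        # block, or report rather than being shown the image outright.
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image
```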
Bumble has now announced that it is open-sourcing Private Detector and making the framework available on GitHub. "We hope the feature will be embraced by the broader tech community as we work together to make the internet a safer place," the company said, acknowledging that it's just one of many players in the online dating market.
Unwanted sexual advances are a common reality for many women both online and in the real world. One study found that 57 percent of women felt harassed on the dating apps they used. More recently, a study from the UK found that 76 percent of girls aged 12 to 18 had been sent unsolicited nude photos. The problem also extends beyond dating apps, with other platforms working on their own solutions.