
As part of its larger commitment to fight “cyberflashing,” dating app Bumble is open-sourcing its AI tool that detects unsolicited lewd images. First debuting in 2019, Private Detector (let’s take a moment to let that name sink in) blurs nudes sent through the Bumble app, giving the user on the receiving end the choice of whether or not to open the image.
“While the number of users who submit lewd images through our apps is fortunately a negligible minority – just 0.1% – our scale allows us to collect a best-in-the-industry dataset of both lewd and non-lewd images, tuned to achieve the best possible performance on the task,” the company wrote in a press release.
Now available on GitHub, a refined version of the AI is open for commercial use, distribution and modification. While a model that detects nudes isn’t exactly cutting-edge technology, it’s something smaller companies probably don’t have the time to develop on their own. So other dating apps (or any other product where people can send dick pics, i.e. the entire internet?) could feasibly integrate the technology into their own products, protecting users from unwanted lewd content.
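For teams that do want to wire it in, the flow is simple in principle: pull the published model, score every incoming image, and blur anything above a threshold until the recipient opts in. Here’s a minimal sketch in Python, assuming the model ships as a TensorFlow SavedModel that returns a single lewd-probability per image; the path, input size, and threshold below are illustrative assumptions, not Bumble’s actual API, so check the repository’s README for the real preprocessing and model signature.

```python
# Minimal sketch: gate incoming images behind a lewdness check.
# Assumptions (verify against the actual repo): the model is a TF
# SavedModel that accepts a batch of 480x480 RGB floats in [0, 1]
# and returns one probability per image. Paths, input size, and
# threshold here are hypothetical, not Bumble's documented API.
import tensorflow as tf

MODEL_DIR = "private_detector/saved_model"  # hypothetical local path
LEWD_THRESHOLD = 0.8  # tune on your own validation data

model = tf.saved_model.load(MODEL_DIR)
# If the SavedModel has no default __call__, use
# model.signatures["serving_default"] instead.

def is_probably_lewd(image_path: str) -> bool:
    """Return True if the image should stay blurred pending user consent."""
    raw = tf.io.read_file(image_path)
    img = tf.image.decode_image(raw, channels=3, expand_animations=False)
    img = tf.image.resize(img, (480, 480))      # assumed input size
    img = tf.cast(img, tf.float32) / 255.0      # scale to [0, 1]
    batch = tf.expand_dims(img, axis=0)         # add batch dimension
    prob = float(model(batch)[0][0])            # assumed output shape (1, 1)
    return prob >= LEWD_THRESHOLD

if __name__ == "__main__":
    if is_probably_lewd("incoming_photo.jpg"):
        print("Blur this image and ask the recipient before revealing it.")
```

The threshold is the key product decision: set it too low and you blur innocent photos, too high and lewd images slip through, which is why a company would want to validate against its own data rather than ship a default blindly.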
Since the release of Private Detector, Bumble has also worked with US lawmakers to establish legal consequences for sending unsolicited nude photos.
“There is a need to address this issue outside of Bumble’s product ecosystem and engage in a larger conversation about how to tackle the problem of unsolicited lewd photos – also known as cyberflashing – to make the internet safer and friendlier for everyone,” Bumble added.
When Bumble first introduced this AI, the company claimed it had 98% accuracy.