Bumble Announces ‘Private Detector' So You Don't Have to Deal With Lewd Pics

When a user slides into your DMs with a nude image, you don't have to look at it.

As apps work to become safer platforms for people to connect, the dating app Bumble is announcing a "private detector," and it's pretty much what you're thinking it is.

A week after Uber announced new safety measures to help riders make sure they are getting into the correct car in the wake of a 21-year-old woman's death, Bumble announced the new safety feature so daters aren't caught off guard by nude or lewd images.

The app's creators built out artificial intelligence that can detect when a nude or lewd image is sent from one user to another.

Bumble says the app feature scans the image in real time, automatically blurs it, and then sends the user an alert that they're being sent an inappropriate image. 

The user then decides whether to view the image or block the unwelcome junk, and can also report the sender.
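The flow Bumble describes can be sketched roughly as follows. This is a minimal, hypothetical illustration: the classifier score, the threshold value, and all names below are assumptions for clarity, since Bumble's actual model and API are not public.

```python
# Sketch of a detect-blur-alert moderation flow, per Bumble's description.
# The lewd_score stands in for a real nudity classifier's output; the
# threshold is an assumed cutoff, not a published value.

from dataclasses import dataclass

LEWD_THRESHOLD = 0.5  # hypothetical cutoff


@dataclass
class ModeratedImage:
    pixels: bytes
    lewd_score: float          # probability from the (hypothetical) classifier
    blurred: bool = False
    warning_shown: bool = False


def moderate(image: ModeratedImage) -> ModeratedImage:
    """Blur and flag an image the classifier considers inappropriate."""
    if image.lewd_score >= LEWD_THRESHOLD:
        image.blurred = True        # recipient sees a blurred preview
        image.warning_shown = True  # plus an alert before viewing
    return image


def recipient_choice(image: ModeratedImage, view: bool) -> str:
    """The recipient decides: view the blurred image anyway, or block it."""
    if not image.blurred:
        return "shown"
    return "shown" if view else "blocked"
```

The key design point is that the decision stays with the recipient: the system only blurs and warns, it never silently deletes.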

Bumble also said the detector is 98 percent accurate.

The app's founders, Whitney Wolfe Herd and Andrey Andreev, started Bumble five years ago with the aim of making the dating app experience better for women.

Wolfe Herd, who just landed on TIME's list of the World's Most Influential People, started the dating app after resigning from Tinder and suing Tinder's co-founder for sexual harassment.

The new "private detector" feature will be available in June 2019 on Badoo, Bumble, Chappy and Lumen.