Facial recognition in apps: What could go wrong?

Uploading selfies to apps and editing them with filters has long been routine for many users. But those self-portraits can end up in databases that security agencies and governments already access.

At first, it's great fun: you upload a selfie to an app, press a few buttons, and your own face looks older, younger, pimple-free, cute, or creepy. Thanks to the photo filters that social networks like Snapchat and Instagram have been offering for years, and with which you can put crowns on yourself or give yourself dog tongues, wildly changing your own face has long been normal, especially for young users.

There is also money to be made from optimized selfies: many apps of this kind are financed through advertising and in-app purchases, and their download figures run into the hundreds of millions. But what happens to the uploaded photos and to the information that can be derived from them?

Not all selfie apps raise the same privacy concerns. FaceApp in particular, which reached a wide audience in 2019 when many celebrities shared their edited selfies on social networks, makes a visible effort to handle users' content with discretion. According to its privacy policy, photos are stored on its servers only briefly, and in end-to-end encrypted form.

According to the current test series by mobilsicher.de, however, none of the tested selfie-editing apps can be recommended. All of them integrate third-party software components that enable personalized advertising. Moreover, they normalize uploading photos of one's own face to online services about which little is known and whose data-processing practices are often dubious.

All information about the test series as well as the assessment of the test results can be found at mobilsicher.de.

This text has been adapted from mobilsicher.de under the terms of the Creative Commons Attribution-No Derivative Works 3.0 Germany license.
