Oops, was that intentional? Tighter nude filter revealed in iOS 26


Who hasn't experienced this? You're sitting on the toilet and accidentally answer a FaceTime call.

(Photo: IMAGO/Westend61)

iOS 26 apparently comes with an optimized filter for sensitive content. A beta tester discovered a feature that prevents you from accidentally revealing more of yourself than intended in FaceTime chats. You can continue if you want.

When Apple unveiled iOS 26 at the start of WWDC in early June, it announced, among other things, new parental controls. Apparently, these include a feature that automatically pauses FaceTime calls if a participant reveals too much bare skin. A beta tester reports on X that, in the current developer build, this also applies to adults, suggesting it may be a general safeguard. In the final version, however, the feature could again be limited to parental controls.

The filter would certainly be useful for adults as well. While many people reveal themselves intentionally during intimate FaceTime chats, it can also happen by accident, for example, if you carelessly accept a video call at the wrong moment.

You might be startled when the video and audio automatically pause, but nothing else happens. A message appears on the screen saying the video has been interrupted because you might be showing something intimate. You then have the choice to end the call or continue it.

Apple sees and knows nothing

The feature can be found in Settings under Screen Time > Communication Safety. There is already an automatic detection of nude videos that intervenes "before they're sent or viewed from your child's device." Currently, however, such a video is simply blurred rather than paused.

Either way, it's reassuring that Apple emphasizes on its informational website that Communication Safety uses machine learning exclusively on the iPhone to analyze sensitive content. Apple also "receives no indication that nudity has been detected and therefore does not have access to the photos or videos," it continues.

Source: ntv.de, kwe
