Apple's controversial nude photo filter for iMessage will soon arrive on all Apple user devices, shipping to every iPhone with the next update. The feature was originally designed to help protect children, yet it drew sharp criticism from the start, with opponents accusing the manufacturer of undermining iMessage's end-to-end encryption. The original plan was for parents of children under thirteen to receive a notification whenever their child was exposed to pornographic content. A revised version is now coming to iPhones and other Apple devices.
The exchange of images and content
Images exchanged via iMessage will in future be analyzed directly on the device. According to the manufacturer, end-to-end encryption remains intact because the check happens locally. As things currently stand, the child can decide whether reports about detected nude images are passed on to their parents. When a photo is classified as explicit, it is displayed blurred, and a notice about possibly sensitive content is generated automatically. Both the recipient and the sender can still choose to view the image normally.
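The flow described above can be sketched as follows. This is a minimal illustration, not Apple's actual implementation: the function names, the toy classifier, and the notification logic are all assumptions made for the example.

```python
# Hypothetical sketch of the described iMessage flow: an on-device check
# blurs flagged photos, and parental reports depend on the child's choice.
# None of these names correspond to Apple's real, unpublished API.
from dataclasses import dataclass


@dataclass
class User:
    age: int
    parental_reports_enabled: bool  # per the article, the child decides this


def classify_sensitive(image_bytes: bytes) -> bool:
    """Stand-in for an on-device ML classifier (pure assumption)."""
    return b"nudity" in image_bytes  # toy heuristic, not a real model


def receive_image(image_bytes: bytes, user: User) -> dict:
    """Decide how an incoming photo is presented to the recipient."""
    if not classify_sensitive(image_bytes):
        return {"blurred": False, "warning": None, "parents_notified": False}
    # Flagged photos are shown blurred with a sensitive-content notice;
    # the recipient may still choose to reveal them normally.
    notify = user.age < 13 and user.parental_reports_enabled
    return {
        "blurred": True,
        "warning": "May contain sensitive content",
        "parents_notified": notify,
    }
```

Note that in this sketch the decision to notify parents stays on the device and under the child's control, matching the revised behavior the article describes.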
Is privacy still a priority?
Apple has always presented itself as a company deeply committed to its customers' privacy, which raises the question of what this new function entails and what consequences it will have. The stated goal is to combat the sexual abuse of children and to stop the spread of abusive images and video footage. The rollout begins in the USA, where Apple will automatically analyze the pictures and photos of iPhone users; Mac computers and iPads are covered as well. According to Apple, no one should be wrongly placed under suspicion.
Will the encryption be removed?
The question now is whether Apple's encryption actually remains intact. According to the company, data stored in Apple's iCloud stays encrypted. Apple uses a dedicated technology that recognizes specific motifs: if an image matches a predefined list of known material, the file is flagged, but only after such suspicious files have actually been identified. Discovered images are reportedly reviewed first by Apple employees and then reported to U.S. authorities. The scope of the project is not yet clear, although international cooperation is also expected in the future.
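The matching step against a predefined list can be illustrated with a short sketch. Apple's real system reportedly uses perceptual image fingerprints rather than exact hashes; the plain cryptographic hash, the blocklist contents, and the function names below are simplifications assumed purely for this example.

```python
# Hypothetical sketch: compare a photo's fingerprint against a predefined
# list of known-abuse material before it is uploaded to cloud storage.
# A real system would use a perceptual hash that tolerates resizing and
# re-encoding; sha256 is used here only to keep the sketch self-contained.
import hashlib

# Assumed blocklist of fingerprints of known illegal images (illustrative).
KNOWN_FINGERPRINTS = {hashlib.sha256(b"known-bad-image").hexdigest()}


def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image (exact hash, for illustration)."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches the predefined list and should be
    flagged for human review, as the article describes."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

Because matching happens against fingerprints of already-known material, an image that is not on the list produces no match at all, which is the basis for the claim that ordinary photos are never flagged.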
If the technology proves error-prone, however, serious problems are bound to arise. A harmless photo wrongly identified as problematic could no longer be sent easily through Apple's Messenger, and a false classification as pornographic content would be fatal for the person affected. Even a correct identification of sexual material strikes many as equally intrusive. The fight against child abuse online has now been going on for two decades, and the technical race continues. In the meantime, Apple has set itself the goal of moving detection mainly onto the end devices themselves, including iPads and iPhones, with artificial intelligence also used to curb potential abuse.