All three new or expanded elements of Apple’s technology could be valuable in combating the exploitation of children, a pernicious problem that demands action. Apple also assures its customers that private communications will remain unreadable.
However – and this is a BIG however – the new initiative also raises serious privacy concerns, including the constant scanning of our iPhone and iCloud images and the potential for intrusion by Big Tech and, eventually, government.
The new technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. To do that, the system scans images on the device, computing a hash of each photo and comparing it against a database of hashes of already-identified CSAM supplied by the National Center for Missing and Exploited Children (NCMEC). So how reliable is that matching? Could it flag a random baby-in-the-tub picture and lead to serious problems for a well-meaning parent?
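Apple has not published NeuralHash, the perceptual hashing algorithm it says the system uses, so the exact matching behavior is not public. As a rough illustration of the general concept, the sketch below substitutes a generic "average hash" for NeuralHash; the hash values, the `matches_known_image` helper, and the distance threshold are all hypothetical stand-ins, not Apple's implementation.

```python
from PIL import Image  # Pillow; a stand-in, since NeuralHash is not public

def average_hash(path: str, size: int = 8) -> int:
    """Generic perceptual 'average hash': shrink the image, convert
    to grayscale, and set one bit per pixel -- 1 if the pixel is
    brighter than the image's mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of *known* flagged images.
# The system matches only against such a list; it does not try
# to classify what a brand-new photo depicts.
known_hashes = {0x8F3C_0000_0000_0000}  # placeholder value

def matches_known_image(path: str, threshold: int = 5) -> bool:
    """Report a match if the photo's hash is within `threshold`
    bits of any hash in the known-image list (threshold is an
    arbitrary illustrative value)."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```

The key detail in this sketch is the lookup: a photo can only match if its hash lands close to the hash of an image already in the database. If Apple's system works along these lines, the open questions are less about the software recognizing subject matter and more about how often two different photos happen to produce similar hashes, and who controls the list being matched against.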
Then, of course, there is the issue of professional fine art nude photography. How will it be treated? Will it be flagged? Suppose you’re a professional photographer who shoots nudes or boudoir images. Will you be affected if you store or send that work on an Apple device? The new security initiative, while highly commendable, raises a lot of questions for many professional photographers.
You can expect the new initiative to take effect with the company’s release of its updated operating systems: iOS 15 for iPhones, along with the corresponding updates for the iPad, Apple Watch, and Mac. You always have the option to delay upgrading your operating system. Still, if you’ve worked with Apple technology for a while, you know you will eventually have to upgrade.
Let’s take a closer look at what the new security initiative could mean for professional photographers.