You might have seen recent headlines declaring Apple’s plans to scan all the images on your phone. In August, Apple announced a child protection feature that will roll out with its upcoming OS update.
Child exploitation, we can all agree, is a bad thing. But why is there so much pushback? Let’s explore Apple’s reasoning and the public response.
Though Apple may have good intentions, its rollout of the technology is concerning. Apple states on its corporate website, “We want to help protect children from predators who use communication tools to recruit and exploit them.”
Apple states these features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.
However, as of September 3, 2021, Apple is back to the drawing board after receiving major backlash from consumers, businesses and privacy protection groups. Apple states, “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Watchdogs say Apple’s new feature is a corporate invasion of our privacy, which is alarming. There are growing concerns about abuse of power, since the technology opens the door to big tech and government surveillance of the data on our phones.
Organizations like the American Civil Liberties Union (ACLU) are concerned that governments will take advantage of these changes, using them as an easy gateway to conduct intrusive surveillance on our phones and devices.
Apple’s scanning system is reportedly too complicated for the average consumer to wrap their head around, so the question boils down to this: can I trust Apple to handle this approach ethically and not misuse my data?
Apple plans to use an artificial intelligence (AI) powered algorithm to scan the photos you upload to iCloud and compare them against databases of known Child Sexual Abuse Material (CSAM) maintained by child safety organizations.
This doesn’t mean Apple is spying on every photo in your iCloud. To Apple, the images appear encrypted. But if the software detects that at least 30 images on a phone match images in those databases, the flagged photos are decrypted so Apple can manually review them, verify their nature, and alert the authorities if they fall under CSAM parameters.
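Conceptually, this works like a fingerprint lookup with a threshold: nothing is surfaced for human review until the number of matches crosses a limit. The sketch below is a heavy simplification of that idea only; Apple’s real system uses perceptual hashing (NeuralHash) and cryptographic protections not reproduced here, and every name in this example is illustrative, not Apple’s actual code.

```python
# Illustrative sketch of threshold-based photo matching.
# NOT Apple's implementation: the hashing, database, and flagging
# logic here are simplified stand-ins for the idea described above.
import hashlib

MATCH_THRESHOLD = 30  # Apple reported roughly 30 matches before any human review


def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash (Apple's system is called NeuralHash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photo_library: list[bytes], known_csam_hashes: set[str]) -> int:
    """Count how many photos in the library match the known-CSAM hash database."""
    return sum(
        1 for photo in photo_library
        if image_fingerprint(photo) in known_csam_hashes
    )


def should_flag_for_review(photo_library: list[bytes],
                           known_csam_hashes: set[str]) -> bool:
    """Only once the match count reaches the threshold is anything flagged;
    below that, no photos are decrypted or reported."""
    return count_matches(photo_library, known_csam_hashes) >= MATCH_THRESHOLD
```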
Apple also plans to use the scanning tool in text messages. For example, if a user searches for, sends, or receives sensitive content via text that falls under CSAM, the photo will be blurred and a warning notification will pop up. If this happens on a child’s phone, their parents will also receive an alert on their own devices.
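Stripped of the detail, the behavior described above is a simple conditional flow. The sketch below is purely illustrative; the classifier and the account fields are hypothetical stand-ins, not Apple’s APIs.

```python
# Illustrative sketch of the described Messages behavior, not Apple's code.
from dataclasses import dataclass


@dataclass
class Account:
    is_child: bool  # whether this device belongs to a child in a family group


def looks_explicit(photo_bytes: bytes) -> bool:
    """Stand-in for Apple's on-device image classifier (runs locally)."""
    return False  # placeholder result for this sketch


def handle_incoming_photo(photo_bytes: bytes, account: Account) -> None:
    if looks_explicit(photo_bytes):
        print("Photo blurred and warning notification shown to the user.")
        if account.is_child:
            print("Alert also sent to the parents' devices.")
    else:
        print("Photo displayed normally.")
```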
To stop Apple from scanning your photos, you can turn off iCloud Photos once the feature rolls out. iCloud Photos is already turned on by default when you purchase your phone, so you’ll need to turn it off manually, especially if you’re a new user and don’t want this feature. To do so, open Settings, tap your name, select iCloud, then Photos, and switch off iCloud Photos.
If you have questions or want to better understand how Apple’s anti-CSAM features affect your phone and privacy, call us and speak to our expert technicians, who are happy to help.
We offer an alternative backup solution to Apple’s iCloud and can easily save all your photos where they won’t be scanned by a third party.
With support solutions for the home and office, My Computer Works is here to help you get back to your life.