Apple has faced severe criticism over its plans to detect images of child sexual abuse, and it has now devised a new system that will hunt only for pictures that have been flagged by clearinghouses in multiple countries.
This major shift came after Apple had to reassure privacy advocates that there would be no breach of privacy.
After previously declining to say how many matched images on a phone or computer it would take before the operating system notifies Apple for a human review and possible reporting to authorities, executives said on Friday it would start with 30, though the number could become lower over time as the system improves.
Apple also said it would be easy for researchers to verify that the list of image identifiers being sought on one iPhone was the same as the list on every other phone, seeking to blunt concerns that the new mechanism could be used to target individuals.
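The matching policy described above can be sketched in a few lines. This is a hypothetical, heavily simplified illustration, not Apple's implementation: the real system uses perceptual hashing (NeuralHash) and cryptographic threshold secret sharing on-device, and the function and variable names below are invented for this sketch. It only shows the two policy ideas reported here, requiring a hash to be flagged by clearinghouses in multiple countries, and requiring at least 30 matches before review.

```python
from collections import Counter

# Starting review threshold cited by Apple executives; the article notes
# it could become lower over time as the system improves.
REPORT_THRESHOLD = 30

def build_match_list(clearinghouse_lists, min_countries=2):
    """Keep only hashes flagged by clearinghouses in multiple countries.

    clearinghouse_lists: one list of image hashes per country's clearinghouse.
    """
    counts = Counter(h for lst in clearinghouse_lists for h in set(lst))
    return {h for h, n in counts.items() if n >= min_countries}

def should_flag_for_review(device_hashes, match_list,
                           threshold=REPORT_THRESHOLD):
    """Return True only once the match count reaches the threshold."""
    matches = sum(1 for h in device_hashes if h in match_list)
    return matches >= threshold
```

For example, a hash present in only one country's list never enters the match list, and a device with 29 matching images stays below the review threshold.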
The company published a long paper explaining how it had anticipated potential attacks on the system and defended against them.
Apple blamed its own poor communication for triggering the backlash from influential technology policy groups and even from its own employees, who were concerned that the company was jeopardizing its reputation for protecting consumer privacy.
It declined to say whether that criticism had changed any of the policies or software, but said that the project was still in development and changes were to be expected.
The rolling series of explanations, each giving more details that make the plan seem less hostile to privacy, convinced some of the company's critics that their voices were forcing real change.
“Our pushing is having an effect,” tweeted Riana Pfefferkorn, an encryption and surveillance researcher at Stanford University.