Apple (NASDAQ:AAPL) is reportedly preparing to debut a new software feature that will scan for and detect child abuse images on Americans' iPhones.
According to a report in the Financial Times, the feature will use an algorithm to scan photos on a person's iPhone, as well as photos uploaded to their iCloud account, looking for matches against a national database of known images of child sex abuse. Images that match would be flagged and reported to law-enforcement authorities.
The company has already used such technology, called "hashing," for photos uploaded to iCloud, but this is believed to be the first time Apple (AAPL) would employ hashing directly on users' iPhones.
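At its core, hash-based matching computes a fingerprint of each photo and checks it against a database of fingerprints of known abuse images. The sketch below is purely illustrative and is not Apple's implementation: real systems of this kind use perceptual hashes that tolerate resizing and re-encoding, whereas this example uses an exact cryptographic hash (SHA-256) for simplicity, with hypothetical file names and byte data.

```python
import hashlib

def sha256_hash(image_bytes: bytes) -> str:
    """Return the SHA-256 hex digest of an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def find_matches(photos: dict, known_hashes: set) -> list:
    """Return the filenames whose hashes appear in the known-image database."""
    return [name for name, data in photos.items()
            if sha256_hash(data) in known_hashes]

# Hypothetical database of known-image fingerprints and a user photo library
database = {sha256_hash(b"known-flagged-image-bytes")}
library = {
    "vacation.jpg": b"harmless-image-bytes",
    "suspect.jpg": b"known-flagged-image-bytes",
}
print(find_matches(library, database))  # -> ['suspect.jpg']
```

A key privacy property of this approach is that the scanner never needs to "look at" photo content semantically; it only compares fingerprints, and an exact match is required before anything is flagged.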
The company reportedly could roll out the image-scanning technology as early as this week.
Apple (AAPL) didn't immediately return a request for comment.
While the technology would nominally be used specifically to identify child sex abuse material, it would likely raise concerns about individuals' privacy rights, especially as Apple (AAPL) has gone to great lengths to tout the security and safety of information stored on iPhones.
"Should Apple take this action, it would have to be very careful in order to not violate anyone’s privacy," said Tim Bajarin, director of technology consultancy Creative Strategies. "However, if a person publicly posts a picture of a child and it showed or hinted of child abuse, these could be scanned by anyone whose goal was to look for child abuse. Apple would have to tread lightly."