Pro@programming.dev to Technology@lemmy.world · English · 1 day ago
Google will use hashes to find and remove nonconsensual intimate imagery from Search (blog.google)
cross-posted to: technology@beehaw.org
Lorem Ipsum dolor sit amet@lemmy.world · 13 hours ago
Because hashes are known to work great with images 🤦‍♂️
gian@lemmy.grys.it · 13 hours ago
They say to use PDQ for images, which should output a similar hash for similar images (but why MD5 for video?). So it is probably only a threshold problem. The algorithm is explained here: https://raw.githubusercontent.com/facebook/ThreatExchange/main/hashing/hashing.pdf — it is not a hash in the cryptographic sense.
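To illustrate the threshold idea: the real PDQ is a 256-bit DCT-based perceptual hash, but the same matching scheme can be sketched with a much simpler "average hash" over an 8×8 grayscale grid (the simplification, the 64-bit size, and the threshold value here are illustrative assumptions, not the PDQ algorithm itself). Similar images produce hashes that differ in only a few bits, so matching is a Hamming-distance comparison against a threshold:

```python
# Toy perceptual hash, NOT the real PDQ: a simplified "average hash"
# on an 8x8 grayscale grid, just to show threshold-based matching.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int
    where each bit records whether that pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(a, b, threshold=10):
    # Threshold matching: a small distance means "probably the same image",
    # even after re-encoding or minor edits. Crypto hashes (MD5) have no
    # such property: one changed pixel flips roughly half the bits.
    return hamming(a, b) <= threshold

# Demo: a gradient image, a near-copy with one pixel nudged, and its negative.
grid = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
near_copy = [row[:] for row in grid]
near_copy[0][0] = 8  # tiny edit
negative = [[252 - v for v in row] for row in grid]

print(hamming(average_hash(grid), average_hash(near_copy)))  # small
print(hamming(average_hash(grid), average_hash(negative)))   # large
```

This is also why the commenter's point holds: false positives and evasion both reduce to where the threshold sits, not to whether the "hash" matches exactly.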
Lorem Ipsum dolor sit amet@lemmy.world · 7 hours ago
There was a GitHub thread about this when it came up for CSAM; they managed to easily circumvent it. I'm rather confident this will end up similarly.