themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 1 day ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
93 comments · cross-posted to: technology@lemmy.zip
bobzer@lemmy.zip · 6 hours ago
Why say “sexual abuse material images,” which is grammatically incorrect, instead of “sexual abuse images,” which is what you mean and is shorter?