themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 2 months ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
bobzer@lemmy.zip · 2 months ago
Why say "sexual abuse material images," which is grammatically incorrect, instead of "sexual abuse images," which is what you mean and is shorter?