This, literally the only reason I could guess is that it's to teach AI to recognise child porn, but if that's the case, why is google doing it instead of, like, the FBI?
Google isn’t the only service checking for csam. Microsoft (and likely other file hosting services) also have methods to do this. That doesn’t mean they host csam themselves in order to detect it. I believe their checks compare a picture’s hash value against a database of images already flagged as being in that category, roughly like the sketch after the link below.
This has existed since 2009 and the link below gives good insight on the topic; it’s used for detecting all sorts of bad-category images:
https://technologycoalition.org/news/the-tech-coalition-empowers-industry-to-combat-online-child-sexual-abuse-with-expanded-photodna-licensing/
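To illustrate the hash-matching idea only: real systems use perceptual hashes like PhotoDNA so resized or re-encoded copies still match, and the hash lists come from clearinghouses rather than a local file. known_hashes.txt and the folder path here are purely placeholders.

# toy version: exact SHA-256 matching against a list of known digests
# (real detection uses perceptual hashing; this only catches byte-identical copies)
for f in "$HOME"/Pictures/*; do
  [ -f "$f" ] || continue
  h=$(shasum -a 256 "$f" | awk '{print $1}')
  grep -q "$h" known_hashes.txt && echo "flagged: $f"
done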
Who do you think the FBI would contract to do the work anyway 😬
Maybe not Google, but it would sure be some private company. Our government almost never does stuff itself; it hires the private sector
guess i gotta get into the private sector, lmao
i know it’s really fucked up, but the FBI needs to train an AI on CSAM if it is to be able to identify it.
i’m trying to help, i have a script that takes control of your computer and opens the folder where all your fucked up shit is downloaded. it’s basically a pedo destroyer. they all just save everything to the downloads folder of their tor browser, so the script just takes control of their computer, opens tor, presses cmd+j to open up downloads, and then it copies the file names and all that.
will it work? dude, how the fuck am i supposed to know, i don’t even do this shit for a living
i’m trying to use steganography to embed the applescript in a png
What’s the ‘applescript’?
the applescript opens tor from spotlight search and presses the shortcut to open downloads
i dunno how much y’all know about applescript. it’s used to automate apps on your mac. i know y’all hate mac shit but dude, whatever, if you get osascript -e aliased to o you can run applescript easily from your terminal, or just pass osascript a heredoc
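for the curious, that setup looks something like this, with a harmless notification standing in for anything from my script (the alias name comes from above, the messages are just examples):

alias o='osascript -e'
o 'display notification "hi from applescript"'

# multi-line scripts can skip -e entirely and read from a heredoc
osascript <<'EOF'
tell application "Finder" to activate
display dialog "automated from the terminal"
EOF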
Google wants to be able to recognize and remove it. They don’t want the FBI all up in their business.
So, Google could be allowed to have the tools to collect, store, and process CSAM all over the Web without oversight?
Pretty much everyone else would get straight to jail for attempting that.