The article headline is wildly misleading, bordering on being just a straight-up lie.
Google didn’t ban the developer for reporting the material, they didn’t even know he reported it, because he did so anonymously, and to a child protection org, not Google.
Google’s automatic tools, correctly, flagged the CSAM when he unzipped the data, and subsequently nuked his account.
Google’s only failure here was to not unban on his first or second appeal. And whilst that is absolutely a big failure on Google’s part, I find it very understandable that the appeals team generally speaking won’t accept “I didn’t know the folder I uploaded contained CSAM” as a valid ban appeal reason.
It’s also kind of insane how this article somehow makes a bigger deal out of this developer being temporarily banned by Google than it does of the fact that hundreds of CSAM images were freely available online and openly shareable by anyone, and to anyone, for god knows how long.
My experience of Google and the unban process is: it doesn’t exist, never works, doesn’t even escalate to a human evaluator in a 3rd world sweatshop - the algorithm simply ignores appeals inscrutably.
I’m being a bit extra but…
The article headline is accurate if you interpret it as
“A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It” (“it” being “CSAM”).
The article headline is inaccurate if you interpret it as
“A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It” (“it” being “reporting CSAM”).
I read it as the former, because the action of reporting isn’t listed in the headline at all.
___
This is correct. However, many websites/newspapers/magazines/etc. love to get more clicks with sensational headlines that are technically true but can easily be interpreted as something much more sinister/exciting. This headline is a great example of it. While you interpreted it correctly, or at least claim to, there will be many people who initially interpret it the second way you described. Me among them, admittedly. And the people deciding on the headlines are very much aware of that. Therefore, the headline can absolutely be deemed misleading, for while it is absolutely a correct statement, there are less ambiguous ways to phrase it.
This is pretty much the art of sensational journalism, popular song lyric writing and every other “writing for the masses” job out there.
Factual / accurate journalism? More noble, but less compensated.
It is a terrible headline. It can be debated whether it’s intentionally misleading, but if the debate is even possible then the writing is awful.
Awfully well compensated in terms of advertising views as compared with “good” writing.
Capitalism in the “free content market” at work.
A 404Media headline? The place exclusively staffed by former BuzzFeed/Cracked employees? Noooo, couldn’t be.
So they got mad because he reported it to an agency that actually fights CSAM instead of to them, so they could sweep it under the rug?
They didn’t get mad, they didn’t even know THAT he reported it, and they have no reason or incentive to sweep it under the rug, because they have no connection to the data set. Did you even read my comment?
I hate Alphabet as much as the next person, but this feels like you’re just trying to find any excuse to hate on them, even if it’s basically a made up reason.
They obviously did if they banned him for it; and if they’re training on CSAM and refuse to do anything about it, then yeah, they have a connection to it.
Google doesn’t ban for hate or feels, they ban by algorithm. The algorithms address legal responsibilities and concerns. Are the algorithms perfect? No. Are they good? Debatable. Is it possible to replace those algorithms with “thinking human beings” that do a better job? Also debatable, from a legal standpoint they’re probably much better off arguing from a position of algorithm vs human training.
Also, the data set wasn’t hosted, created, or explicitly used by Google in any way.
It was a common data set used in various academic papers on training nudity detectors.
Did you seriously just read the headline, guess what happened, and are now arguing based on that guess that I, who actually read the article, am wrong about its content? Because that’s sure what it feels like reading your comments…
So you didn’t read my comment then, did you?
He got banned because Google’s automated monitoring system, entirely correctly, detected that the content he unzipped contained CSAM. It wasn’t even a manual decision to ban him.
His ban had literally nothing whatsoever to do with the fact that the CSAM was part of an AI training data set.
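For anyone doubting that this can be fully automatic: here’s a minimal, purely hypothetical sketch of the hash-list scanning technique these systems are built on. Real pipelines (PhotoDNA, Google’s CSAI Match) use perceptual hashes that survive resizing and re-encoding; the plain SHA-256 blocklist and every name below are illustrative only.

```python
# Illustrative sketch of hash-list scanning, assuming a toy SHA-256
# blocklist. Real systems use robust perceptual hashes and vendor APIs;
# every name and value here is hypothetical.
import hashlib
from pathlib import Path

# Placeholder entry; real blocklists come from clearinghouses like NCMEC.
KNOWN_BAD_HASHES = {"0" * 64}


def sha256_of(path: Path) -> str:
    """Hash a file in 1 MiB chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def scan_upload(root: Path) -> list[Path]:
    """Return every file under `root` whose hash is on the blocklist.

    The key point: once the archive is unzipped, each extracted file is
    individually scannable, and any match fires automatically. No human
    weighs intent before the account is actioned.
    """
    return [p for p in root.rglob("*")
            if p.is_file() and sha256_of(p) in KNOWN_BAD_HASHES]


if __name__ == "__main__":
    flagged = scan_upload(Path("extracted_dataset"))
    print(f"{len(flagged)} file(s) matched the blocklist")
```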
ATM machine
Which of the letters in CSAM stands for images, then?
Material.
Material can be anything. It can be images, videos, theoretically even audio recordings.
“Images” is a relevant and sensible distinction. And judging by the downvotes you’re collecting, the majority of people disagree with you.
And, if you’re trying to authorize law enforcement to arrest and prosecute, you want the broadest definitions possible.
You’re right. It can be images, and that’s exactly why saying “this man was found in possession of child abuse material images” does not make grammatical sense. It’s why CP still defines it better, as we’re not arresting people for owning copies of Lolita, which you could argue is also CSAM.
The majority of people can be wrong.
Big “Ben Shapiro ranting about renewable energies because of the first law of thermodynamics” energy right here.
And your point is literally the opposite. Lolita could be argued to be child porn, as it’s pornographic material showing (fictional/animated) children. It is objectively NOT CSAM, because it does not contain CSA, because you can’t sexually abuse a fictional animated character.
CP is also a common acronym that can mean many other things.
Porn also implies it’s a work of artistic intent, which is just wrong for CSAM.
No they can’t, not with regards to linguistics. Linguistics is a descriptive science, not a prescriptive one. Words and language, by definition, and convention of every serious linguist in the world, mean what the majority of people think them to mean. That’s how language works.
“I’m mad you’re right so let me compare you to a hateful right wing grifter and also by the way, you’re wrong because all my friends say so.”
It may shock you but a handful of Lemmy users doesn’t constitute the linguistic consensus you’re trying to inherit here.
I’m not comparing you to Ben Shapiro, I’m comparing your grammar-nazi pedantry to a single specific instance of his grammar-nazi pedantry.
I also gave several explicit reasons why using CP over CSAM is idiotic, not just “my friends say so”.
So that’s 2 for 2 for wildly and dishonestly misrepresenting my points.
But hey, if you want to be like that sure.
You’re right, everyone else is wrong, you do you and keep using CP instead of CSAM, and keep getting irrationally upset and angry at people who think CSAM is a better term. Happy now?
CSAM stands for “material”. Adding “image” specifies what kind of material it is.
A “material image” doesn’t make any sense. An image is material. It should be CSAI if you wanna be specific.
I don’t know why this is the second time I’ve had a discussion about CSAM being a stupid acronym on Lemmy, but it’s also the only place I’ve ever seen people use it.
Material. Type of material: Image
Why say sexual abuse material images, which is grammatically incorrect, instead of sexual abuse images, which is what you mean, and shorter?
We need to block access to the web to certain known actors and tie ipaddresses to IDs, names, passport number. For the children.
Also, pay me exorbitant amounts of taxpayer money to ineffectually enforce this. For the children.
Oh hell no. That’s a privacy nightmare to be abused like hell.
Also, what you say wouldn’t work at all.
In the current digitized world, trivial information is accumulating every second; preserved in all its triteness, never fading, always accessible; rumors of petty issues, misinterpretations, slander.
All junk data preserved in an unfiltered state, growing at an alarming rate; it will only slow down social progress.
The digital society furthers human flaws and selectively rewards development of convenient half-truths. Just look at the strange juxtaposition of morality around us. Billions spent on new weapons to humanely murder other humans. Rights of criminals are given more respect than the privacy of their own victims. Although there are people in poverty, huge donations are made to protect endangered species; everyone grows up being told what to do.
“Be nice to other people.”
“But beat out the competition.”
“You’re special, believe in yourself and you will succeed”.
But it’s obvious from the start that only a few can succeed.
You exercise your right to freedom and this is the result. All the rhetoric to avoid conflict and protect each other from hurt. The untested truths spun by different interests continue to churn and accumulate in the sandbox of political correctness and value systems.
Everyone withdraws into their own small gated community, afraid of a larger forum; they stay inside their little ponds, leaking whatever “truth” suits them into the growing cesspool of society at large.
The different cardinal truths neither clash nor mesh, no one is invalidated but no one is right. Not even natural selection can take place here.
The world is being engulfed in “Truth”. And this is the way the world ends. Not with a BANG, but with a…
Is this a fresh new copypasta, or are you just a really long-winded, elaborate troll?
Fuck you, and everything you stand for.
That sounds like sarcasm to me
People on Lemmy don’t understand sarcasm because they have brain damage.
including you
No need to go that far. If we just require one valid photo ID for TikTok, the children will finally be safe.