/g/ - Technology



Nothing could possibly go wrong with this.
>>
>>103258637
That sounds very illegal.
>>
>>103258637
... but does it know the names of all the actors / actresses and their partners? Can it win pedobingo?
>>
File: skeleglasses.jpg (26 KB, 500x500)
>>103258637
who gave them permission to do that?
did they get a signed release from every victim?
>>
>>103258637
>trained on REAL CHILD SEX ABUSE IMAGES
GEE JOE, WHERE DID THEY GET THOSE IMAGES, WHO VERIFIED EVERY SINGLE ONE OF THEM AND TAGGED THEM HMM??
I fucking love how many questions this single headline throws up, not just legally but also the morality of it all.
>>
>>103258665
It's not illegal when the government does it. /s
>>
>>103259003
it isn't the government though
>>
>>103259003
>Gov Recruiter: How can we help you today?
>*chan: I want to watch cp all day and categorize it.
>Gov Recruiter: *long pause* Let me put you on hold ...
>>
>>103259027
>>103259015
these except unironically
>>
>>103258637
> Listen, and understand! That Diddler is out there! It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop, ever, until you are diddled.
>>
>>103258637
Two weeks later:
>granny is accused by anti-pedo AI of producing CP (it was a photo of her baby grandchild in the tub)
>anti-pedo AI is leaked and used to produce CP on the dark web
>anti-pedo AI is shut down after it targeted (((somebody important)))
>>
Very fun watching pedos seethe and just inventing scenarios in their mind because they didn't actually bother looking up what this is lol
>>
Very fun pretending to be a contrarian redditor to get cheap (You)'s on 4chan org
>>
Throw this AI in the woodchipper along with all the pedos
>>
>>103259329
>anti-pedo AI is shut down after it targeted (((somebody important)))
So how many silk road derivatives have we been through already?
>>
Put this AI on a woodchipper along with all the redditors
>>
>>103259724
t. boomer who sends his daughter to college
>>
>>103259329
>>anti-pedo AI is leaked and used to produce CP on the dark web
I doubt a neural network trained for image recognition could be repurposed for image generation
>>
>>103260371
No but somehow it could be tricked to give the hash/file names/URLs of the content.
>>
File: list.jpg (47 KB, 680x505)
>>103259003
True and real. They ran the largest online distribution network and only cut it down because it was becoming too big to fail. And don't forget that 30 years ago they were the largest producer and distributor of VHS content of that type.
>>
>>103258704
They need it to... to work from home or something.
>>
>>103258637
golly gee goobermint, how did you train it on REAL CP images? hmmm? is it ok to have those images if this is the reason?
>>
>>103258637
What for? Why do we need AI to recognize these? Shouldn't they try to keep children from being abused instead of caring about those who already were?
If the goal is to make AI able to recognize any new CP as well, that has already existed for over ten years; it's even integrated into plenty of software and OSes.
>>
>>103260371
I could be wrong but according to what I was told that is the very way it works. The first part generates random shit, the second part says how much it looks like what it was trained on. Then the first part generates some other random shit, the second part says if it's better or worse than the first shit. And it goes on, the second part guiding the first one until its random shit looks very much like what it was trained on, just like when we were kids and we hid some object and told our friend "You're getting closer" until the friend found the object.
>>
fine tuned model where
>>
This is worse than the time they exiled that pedophile into space with a child on board.
>>
>>103260928
Worse?
>>
>>103258637
So is this a means to automatically flag illegal content the glowniggers post so you don't have to sign up for Microsoft's and other glownigger services?
>>
>>103260928
That sounds so much like a Peter Griffin joke.
>>
>>103260928
>>103260991
>>103261114
https://www.youtube.com/watch?v=SRRw1ERj2Gc
>>
>>103261114
I think it's an Onion sketch but I can't find the clip. It might have been someone else parodying BBC news.
>>
>>103261163
Beat me to it >>103261158
>blocked in your country
that explains why I couldn't find it.
>>
>>103261169
https://files.catbox.moe/iekfq3.mp4
here you go, anon
>>
>>103261202
My man! I didn't even have to ask.
Thanks Anon.
>>
>>103261214
<3
>>
>>103261202
Is this the onion?
>>
>>103261214
same here
>>
>>103261222
brass eye - it's a bri'ish show
>>
File: 1732223981398.jpg (44 KB, 1024x576)
>look, I need to watch all this CP during work hours and meticulously organize and tag it to... Uhhh... Protect the children. You want to protect them, don't you? Are you some kinda pedo? I though so. Now fetch me more papers towels and lube
True story.
>>103258974
It was me. Sorry.
>>
File: Synonymous.png (39 KB, 588x146)
Parody has once again become reality. Poe's law in full effect.
>>
>>103261229
https://files.catbox.moe/l54k64.mp4
>>
File: file.png (144 KB, 826x773)
>>103258637
POST
THE
FUCKING
LINK
https://safer.io/about/
>>
>>103261328
OK, but what if I don't want Safer? What if I want normies to be flooded with pizza?
>>
File: ai.png (342 KB, 1068x308)
that's not the actual headline you idiots
>>
>>103261515
That means the same thing, and they obviously just changed it. You sir are the idiot!
>>
>>103261515
Do you want to give me a boner?
>>
>>103261459
Insane that it's been over a decade since this game came out and there is still practically 0 2d lolige made in the west. And it's been 3 decades since loli was popularized in the west, still no relevant lolicon artists, just endless waves of rule34 autism
>>
>>103258637
can I get this dataset so I can train my own model to defeat pdf files?
>>
>>103258665
It's not illegal when (((NGOs))) do it. Not the first time either. Just search for "saucenao canada" and you will immediately find it.
>>
>>103261515
For she's fully dressed anyway, why did they illustrate the article with a pixelated picture of a clearly 18+ woman instead of a child? That makes no sense. (And if you ask how I know she's not a kid or a teenager, it's from the length of the legs and arms compared to the torso, the size of the feet compared to the length of the legs, the size of the head compared to the rest of the body, etc.)
Also
>>will make it harder to spread CSAM online
The only persons in the world who would want to spread this shit online are the feds, for their honeypots. Pedos will never want to 'spread' it, they'll want to 'find' it, big difference. It's like saying drug addicts spread drugs all around.
>>
>>103261743
Excuse me sir that <35 year old child is not for you to be looking at with your evil gaze, back off and stop asking questions before I call the cyber police.
>>
>>103261743
>The only persons in the world who would want to spread this shit online are the feds
Wrong. I'm not a fed, but I really like to watch NPCs seething.
>>
>>103261743
>Pedos will never want to 'spread' it, they'll want to 'find' it
exactly, otherwise we'd see much more cp on the interwebs. or they DO spread it, in which case the threat is ridiculously small.
what's the threat, anyway? you're not going to catch any people producing that shit by... removing an image/video a fed/hillbilly posted on a basket weaving forum...
oh well, at least we know that someone positively has a large enough set of cp to train a model (they certainly wouldn't spread any of those to bait people)
>>
>>103261733
That makes me wonder how they get these "millions" of photos and videos. Surely they don't go getting them from the internet or darknet, they get them from "officials". So imagine them politely asking: "Hey, gov, I need a collection of millions of disgusting photos and videos of kids being raped. Surely you have this, don't you? PS: By the way that's for doing something right, I promess. I just need to spend my days fighting this thing, while not being part of any police force, for free, like a hero, for, er, reasons."
>>
File: 1678408505823215.jpg (194 KB, 736x981)
>>103258637
And then one day, for no reason at all...
>>
>>103258637
too old
>>
>>103261889
ironically there has never been a government with more pedophiles. They basically did state mandated pedophilia at one point.
>>
>>103261788
I came across it a few times on random porn websites and these were obvious honeypots. On the thumbnails of the videos there was a big text like "if you want more, contact us at something@gmail.com or telegram". These were deleted by the website a few minutes after being uploaded. I remember thinking "I doubt perverts are stupid enough to fall for this... A gmail address, come on!"
>>
>>103261943
Nazis? I've read that on the contrary the books they burned were only pedo books and other degenerate shit that (((some people whose name always has echo))) were publising.
>>
>>103262037
I assume he's talking about the US government
>>
File: max fag kike.png (72 KB, 437x349)
>>103262037
it was (((Max Hirschfield))) and his tranny clinic.
This was the degeneracy that lead to the book burnings and the pink triangles going into the camps with the jews
>>
File: 1669469214763258.jpg (15 KB, 369x308)
>>103261943
I wonder where you get your sources
>>
So a corporation is allowed to have a mega database of pizza? That sounds illegal no matter how they try to spin it. Feds don't go light on that stuff.
>>
>>103262252
they were probably contracted by one of those agencies to do it.
>>
>>103261515
OP here, the headline was the one in my screenshot at the time I took it.
Proof someone at Arstechnica reads this board?
>>
>>103262605
Don't know about Arstechnica but some glowies are here, sure.
>>
>>103259329
>>granny is accused by anti-pedo AI of producing CP (it was a photo of her baby grandchild in the tub)
this has already happened with google filters, and Apple will delete personal pictures if they think it's cp
>>
>>103259329
Unironically two weeks later:

>in the vein of the llama and miqu leaks, it ends up on /lmg/ mysteriously one morning (AM EST time)
>>
>>103259003
>/s
rediit
>>
>>103259003
>/s
kys nigger
>>
>>103258637
don't care how it came about, this sounds super useful for automodding imageboards where feds and pedos scouting for a new place to crash do drive-by dumping of cp.
>nooo my free peach! I must share my ai generated cp on your board!!
not my problem + anything that pisses off aifags is a plus in my book.
>>
>>103259329
>>anti-pedo AI is leaked and used to produce CP on the dark web
Not how computer vision models work
And there are probably generative models for that sort of thing out there already
>>
>>103260832
That's called reinforced learning but the end result is still a singular model. It's a black box that can't be reversed
>>
>>103259003
>/s
please die immediately.
>>
>>103259003
>/s
Kill yourself.
>>
File: misi.png (141 KB, 480x360)
>>103258637
>>
>>103260832
this is how a GAN works
it's totally possible to just train a classifier without a generator
but in a GAN, the generator needs the classifier
>>
>>103261711
>>103261459
that shit hits way too close to home for 2d chads so most avoid it even if the art is good, maybe if the artist comes back and does a fantasy gb isekai thing I'd be interested.
>>
>>103260371
There is a type of generative model known as a Generative Adversarial Network which consists of two types of neural networks. One of these networks is called a discriminator, and its job is to classify images as belonging to the target class or not. The other is called a generator, and its job is to produce images of the target class. These models are designed to be trained in tandem, so that the generator learns to fool the discriminator, and the discriminator learns to get better at spotting the fakes.

While it's generally not recommended to start with a perfect discriminator, I do believe there should be some variant GANs out there that would enable a generator to learn the types of images that a perfect discriminator will classify correctly. In fact, I know I've seen papers that show that adversarial training against a black box classifier is possible.
>>
>>103258637
Cant wait for another Microsoft update that will have this implemented to work with Recall.
>>
>>103265372
also the error gradient of the discriminator is passed to the generator
i.e. the generator gets to peek at how the other guy told the difference
picrel
t. implemented one from scratch 5 years ago
>>
>>103265547
I'm actually currently working on a research project involving comparing different generative models for a particular task, and the one I'm currently in the process of developing and optimizing... is a GAN. That figure is an accurate overview of the training process.

Also, part of me wants to guess that it was made in draw.io, which is a tool I use for my own figures, and also tends to have similar... default color schemes.
>>
Cstaber
>>
>>103265567
i stole it from here:
https://developers.google.com/machine-learning/gan/generator
you'd be surprised how many people omit the generator peeking at the discriminator loss in articles or videos
it's like everyone is copy-pasting the same quick overview that happened to leave it out
absolutely catastrophic loss of knowledge
also go in any machine learning d*scord server and nobody can implement anything from scratch
(they only know how to use some shitty library that abstracts over it all)
>>
>>103265640
>they only know how to use some shitty library that abstracts over it all
Are you talking about using something like PyTorch or Keras, which everyone is using, or are you suggesting they're using some pre-built GAN library?
>>
>>103265683
pytorch, keras, tensorflow, etc.
>>
>>103258974
>>103259329
False positives will be a real problem. I can't imagine a way that the system could be trained to have low enough false positives to be feasible. How in the world could a system like this be trained accurately enough? I listened to a talk about training ADAS visual systems in a car. The thinking was that correcting a false positive of the car seeing the moon and identifying it as a stop sign would take 1 million images of the moon from that spot. And that was for a single viewing of the moon from a specific spot. If you change either the position of the moon or the car, you need another 1 million images. I would imagine something similar would be needed to get the false positive rate low enough to actually deploy some kind of auto-detecting image system
>>
>>103265700
These libraries don't abstract so much that you could implement a GAN without knowing how they work. At the level of "knowing that the generator is peeking at the discriminator's loss", you are describing the training step, which has to be supplied by the programmer using these libraries. PyTorch and the like don't know what a GAN is, but they do provide the building blocks. You don't have to write code to manually do all of the linear algebra involved in each of the individual layers, but you do need to specify what the layers are, and what a forward pass through the layers looks like. You also need to tell it what the loss function is, and in the case of a GAN, how that loss function is applied during training.
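To make that concrete, here's roughly what a hand-written GAN training step looks like in PyTorch, on throwaway toy data. This is a minimal sketch for illustration only; the tiny MLP sizes and the stand-in "real" distribution are made-up assumptions, not anything from the article or any real system.
[code]
# Minimal GAN training step sketch in PyTorch (toy 2D data, illustrative only).
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))   # generator: noise -> fake sample
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))   # discriminator: sample -> logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(64, 2) * 0.5 + 2.0    # stand-in "real" distribution (assumption)
    noise = torch.randn(64, 8)
    fake = G(noise)

    # Discriminator step: push real -> 1, fake -> 0 (fake is detached so G isn't updated here).
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: the loss is computed THROUGH D, so D's gradient flows back into G.
    # This is the "generator peeking at the discriminator" part. The library has no idea
    # this is a GAN; the programmer encodes that entirely in this training step.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
[/code]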
>>
>>103265725
If you have a false positive rate of 1 in 10,000, which is pretty damn impressive, you're still going to get several false positives per day if you deploy it on a large site.
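Back-of-the-envelope version, just to put numbers on it (the daily upload volume here is an assumed figure, not any real site's):
[code]
# Expected false positives per day at a given false positive rate (toy numbers).
false_positive_rate = 1 / 10_000       # the 1-in-10,000 rate from above
clean_uploads_per_day = 500_000        # assumption: a "large site"

print(clean_uploads_per_day * false_positive_rate)   # 50.0 -> dozens of wrongly flagged uploads every day
[/code]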
>>
>>103265725
God you're retarded. Subject and context matters and you can't compare human identifying images to self-driving cars
>>103258704
Ever heard of evidence seizure and submission?
>>
>>103260755
>What for?
Read the article, retard.
>>
>>103259003
gemmy
>>
>>103265725
False positives are fine when its about content hosting. It just means that one user will not be able to upload some image every now and then.

They aren't using those to automatically jail pedos, they are using them as content filter.
>>
>>103265808
right but the implementation on the part of the programmer becomes a task of copy-pasting the relevant - what i would call boilerplate - code to drive the library
i'd wager most people who have implemented a GAN with an abstract library don't pay attention to the specifics of how the loss flows through the system
>>
>>103259003
Unironically 99% of pedo websites on the clearnet, and 66% of pedo websites on the darknet are run by governments.
Western governments are unambiguously the largest distributor of child sexual abuse material, it's not even a competition
>>
>anons are outraged because they're guessing what the article is about instead of just reading it
Why am I not surprised
>>
>>103266117
It's been that way for a while. Lots of the current generation do not like to read and have little endurance for reading for longer periods of time. Doesn't help though that OP did not actually link the article, so you'd need to google search to read it.
>>
>>103260531
>30 years ago they were the largest producer and distributor of VHS content of that type
Wat. Nigga I'm gonna need some sauce on that.
>>
>>103258704
Stop asking questions goy
>>
>>103259003
>/s
blow your brains out
>>
>>103263820
There is a gigantic collection of hashes for pretty much all known CP and the feds have it. Using that to auto moderate would be the logical choice.

Using AI is retarded. Even if it has a 1 % rate of false positives, it means that for every 1000 posts, 100 will be falsely flagged. On social media with millions of posts per hour that would mean about 1000 false positives each hour, and they're not going to manually check, they're gonna go the YouTube/Amazon road and block legitimate users and have them go through loops for weeks/months to finally get a big "fuck you" and have their accounts blocked permanently.

That is if they don't needlessly add them to a list or call the cops to begin with.
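For the hash-matching side, here's a rough sketch of what that kind of auto-mod check could look like using a generic perceptual hash. PhotoDNA itself is proprietary and not something you can just download, so this uses the open-source imagehash library as a stand-in, and the hash list and distance threshold are hypothetical placeholders, not real values.
[code]
# Sketch: flag an upload if its perceptual hash is near any hash on a known-bad list.
# Uses the open imagehash library as a stand-in for proprietary systems like PhotoDNA.
from PIL import Image
import imagehash

# Hypothetical placeholder: in practice this list would come from a vetted authority.
known_bad_hashes = [imagehash.hex_to_hash("0f3a5c7e9b1d2468")]

MAX_HAMMING_DISTANCE = 5   # assumed threshold; tighter = fewer false positives, more misses

def should_block(path: str) -> bool:
    upload_hash = imagehash.phash(Image.open(path))          # perceptual hash of the upload
    return any(upload_hash - bad <= MAX_HAMMING_DISTANCE     # ImageHash subtraction = Hamming distance
               for bad in known_bad_hashes)
[/code]
Matching against a known list like this can only catch material that's already been catalogued; the tradeoff with a classifier is that the classifier can flag new material but brings the false positive problem discussed above.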
>>
>>103266821
There are 3 kinds of mathematicians, those that can count and those that can't.

But on a more serious note, one need not have any pre-existing CSAM to detect kiddos if AI were really AI. There are easy to identify characteristics that will not exist on a normal adult. That is where the false positives come in but they would be exceedingly rare if the AI is really AI.
>>
>>103266018
It's worse than that. 99% of websites on the clear net distribute csam, no I won't provide a source.
>>
>>103258637
For what purpose? Assume I don't know they want to use this for their own pleasure.
What is the "clear net" reason they give for having this?
>>
>>103258637
>>103259003
>>103259015
Yeah it's child trafficking orgs masquerading as "child safety" NGOs. It's widely known that they traffic more children across borders incognito than anyone else on earth.
>>
>>103259003
>/s
fag
>>
>>103258637
>real
fictional children cause pain too
>>
>>103261849
Why do you think they have so many tissues in the room? Probably because they're crying.. yeah..
>>
>>103266018
Meta and all its products (Facebook, Insta, WA, etc) are the number one source of CP worldwide. Regardless of clear, dark, or altnet origin, nothing comes close to them other than the FBI, which is number two.
>>
>>103260371
They actually can. You can generally brute force them into reproducing content that they were trained on. It's not perfect and it's still statistical but you can do it.
>>
File: 1705964104891443.png (167 KB, 492x597)
>>103266655
>>
>>103258637
I was expecting this. I'm sure some glowie contractor will abuse this to make his own gooning material, but that's the least of my concerns. Question is how many false positives will this flag without any manual review leading to people getting reported or losing access to services.
>>
>>103258704
shush now
>>
>>103258704
>did they get a signed release from every victim?
That's the real reason it's illegal to copyright cp. Otherwise victims could sue the feds for using their content
>>
Whats the supposedly "legit" use of this?
>>
>>103274024
>legit
>>
>>103274024
scan massive amounts of data, coom to the positive results
>>
>>103258637
This is a perfectly sane thing to do.
>>
>>103274024
You're asking too many questions.
>>
>>103265113
Steamed hams indeed.
>>
>>103258637
So they are the real pedos...
>>
>>103274024
Automating moderation, obviously.
>>
>>103269862
I remember when twitter (before elon musk) refused to take down cp when the child in the video reported it. They even reviewed the video and said no.
>>
>>103258637
link? for research purposes
>>
>>103278238
Based.
Hopefully Musk will continue the trend and dismantle law enforcement so hard they'll wish for leftist defunding instead.
>>
>>103261515
>>103258637
This poses danger but not in the way /g/tards think it does, you could train a counter model that aims to generate pics that gives the biggest positive in the classification model and then you'd have a CP GAN which would produce the CPest images ever
>>
File: ezgif-1-1fc3136d8d.gif (3.05 MB, 600x371)
>>103258665
>>103258704
>>103258974
>>103259003
>>103259027
I know you guys are retarded brown /g/tards that pop up any time the government and CP are mentioned, but let me break down what the FBI/NSA/HLS actually do with their CP databases to stop crime.
Every time you hear of a CP bust in the news or "Man found with 3 terabytes of CP", you do realize SOMEONE has to look at it, inspect it, document it, and store it securely. They have trained, vetted, veteran professionals authorized to handle CSAM in forensics for court cases. Each video or picture gets hashed and stored in a file on some FBI server. These hashes are compiled from image scrapers and harddrives from CP busts to create a catalogue. Once an image is hashed, it doesn't have to be looked at again, so it is stored on an extremely secure crypto server that only a few people have access to. Juan Illegal the DEI hire never sees these and probably doesn't even know about them. Mr. 30 years on the force VIP Detective does not jack off to this CP database. They are not sold to Mr. Elected Official Pedophile in the white house.

The purpose of these hashes is to make sure when a bust happens or a new Lolihaven.rs site on the deep web pops up, they know which images are old because it's automatically indexed in their hash database. Mr. Pizzapedo from Twitter gets busted for 500 CSAM images after grandstanding that Pizzagate was debunked, and all 500 images he downloaded were already in the FBI database. This shows he's just a pervert and not a distributor, so he gets an easier sentence.
Now, the ringleader of a massive child trafficking ring gets busted with 3tb of freshly made videos that aren't already hashed? 10 gorillion life sentences, no parole. His videos are then saved, hashed, and any time in the future they are downloaded and picked up by the image scrapers, FBI will instantly know where they came from, who he sold them to, and connections are made in that CSAM ring, where busts will continue.
>>
File: 1518041619881.jpg (215 KB, 518x617)
>>103282206
Cont.

In cases like OP's article, where a firm wants to "borrow" CSAM images to train a model, it most likely goes through months and years of vetting processes, and a CSAM handler will be there at all times. The FBI is not literally handing over drives of CP to a random firm. All training will be conducted on approved sites with approved authorized personnel, and minimal images will actually be viewed. They are not pedophiles exchanging trillions of images for funsies. This is a legit model that will help track down and compile CSAM in massive amounts.
Trained CSAM handlers are not allowed to work on cases for very long. They go to lots of therapy. They are extremely damaged and hardened people due to the sensitivity of their work.
When the FBI receives a case, the person going through the drives doesn't work blindfolded. He needs to view each image to create a case against the perp. EnCase or whatever proprietary forensic tool they use probably doesn't show the full images, it just compares against the hash database.

The use of darknet honeypots is also legit, and not the FBI making a quick buck off CP. It's not like the CIA crack epidemic or cartel business. It is used to catalogue who creates, and who downloads, these images from fake sites. Yes, you have to be a monkey retard to fall for a CIA honeypot when you download CP, but there are millions of pedos out there and they fall for it constantly. They get a slap on the wrist if they're not distributors or creators.

I can go further in depth but I'm not an industry pro. Just a guy in digital forensics that has read several books on tales and cases of CP from 50+ year career veterans, because it's the field I wanted to go into years ago. I have a strong sense of justice and wanted to direct this towards abusers of children.

The anti-CP divisions of FBI/CIA/NSA of olde are not your enemy. They have been co-opted in recent times to protect politicians and are corrupted by their department heads.
>>
File: 1639922134561.jpg (151 KB, 638x717)
>>103282206
>>103282303
holy mother of pasta!
>>
>>103282303
>Trained CSAM handlers are not allowed to work on cases for very long. They go to lots of therapy. They are extremely damaged and hardened people due to the sensitivity of their work.
Is CSAM another word for gore?
>>
>>103282426
Child sexual abuse material. It's the zoomer way of saying CP but it encompasses more.
>>
>>103282206
>>103282303
you're either a gullible retard or a lying fed
>>
>>103282206
What a load of horsecrap, from the idea that the FBI is actually competent to the illusion that production of cunny isn't its own charge.
>>
>>>>103282303
what kind of sick fuck would go into the office and look at cp all day while collecting a salary?
that's actually deranged, but i would expect nothing less from spooks
>>
>>103282758
You don't even acknowledge it as important work someone has to do?
>>
>>103282758
Mate... i said it before... who do you think charges perps in CP cases? A dog? A blind person? Don't be retarded for 2 goddamn seconds. That's like being upset that morticians and autopsy people exist. Who would want to play with blood and guts and be around dead bodies all day? Sick fucks.
>>
>watch porn all day and get paid for it
Cool.
>it involves children
So?
>>
>>103280398
>produce the CPest images ever
This sentence would be so funny if it wasnt so fucked
>>
>>103282872
It's not though. They are virtual children. There is no fucking happening.
>>
>>103282844
>>103282858
>ughhh, you don't get it, i HAVE to run cp honeypots on the dark net and stare a cp all day to catch le bad guys
>>
>>103282754
https://www.scribd.com/document/723937237/Criminal-Investigation-the-Art-and-the-Science-9e-Michael-Lyman
https://www.scribd.com/document/566610437/Cybercrime-Investigations-a-Comprehensive-Resource-for-Every
Here are 2 sources with chapters on CSAM and forensics. I can't find the original book I read, but i have others downloaded. You won't care to read them though
>>
>>103282929
>doubling down on government lies
Don't care, I will always vote not guilty.
>>
>>103282908
Those children that are victims of trafficking have homes and families. Identifying the children can help lead to busts. Someone has to do the identifying and cataloguing. If we didn't have specialists working with this material, we wouldn't have any anti-CP departments. It would all get blind-eyed and be even more rampant. What do you suggest exactly? How do you fight against something you're not allowed to look at? That's like wanting to go to war with no guns because the other side are bad guys with guns so if we use guns we'd be just as bad as them.
>>
>>103282968
>Those children that are victims
Stopped reading.
>>
>>103282886
The case against AI depictions of children makes sense. If you make a 1:1 AI depiction of a child based on CSAM, it's still the child. It was based on an abused child. Just because it's a "robot drawing" doesn't put you in the clear. This is what the courts ruled regarding AI regulations, any AI depictions of known CSAM images will be prosecuted justly.
>>
>>103283000
>The case against AI depictions of children makes sense
Also stopped reading.
>>
>>103282968
>That's like wanting to go to war with no guns because the other side are bad guys with guns so if we use guns we'd be just as bad as them.
you were this close to reaching the right conclusion...
>>
>>103283004
>pdf boots up offline StableDiffusion fork
>loads image of CP
>"AI, give this girl blonde hair and make 20 variations in different poses"
>hehe I found the loophole, legal CP for everyone! no children were harmed in the making!
This is the argument for AI regulations. Telling chatGPT to generate a loli is not explicitly illegal. Using a fork to generate loli based on CSAM it was trained on, is illegal. You will be prosecuted for the material used to train, and ownership of the images it generates.
>>
>>103283000
>prosecuted justly.
stopped reading there. what a load of bullshit. it is known that the police blows it out of proportion and even add their own material to the list of content found with the suspect in order to increase the sentence
>>
>>103282968
>Those children that are victims of trafficking have homes and families
they don't
no one cares about those children
>>
>>103283043
Good thing Mark Hasse proved that prosecutors are just as mortal as the rest of us.
>>
File: 1000037661.jpg (11 KB, 232x217)
>>103282303
we should make a database of glowing posts like this so you guys would stop flagging yourselves and learn a little bit
>>
>>103282968
What the fuck are these replies? People attack CP countermeasures by going balls deep into ironic shitposting to the degree it seems like honesty.
>>
Are children tried as adults when they post CSAM to chan boards?
>>
>>103283247
there are past cases of children being prosecuted for producing content of themselves
>>
>>103283300
Do they have to stand in the corner, do jumping jacks, write "I will not post my flat chest on boards again" on a dry erase board 1000 times?
>>
>>103283329
It's pretty much always boys who get charged with that. Some of them even get charged as an adult, if they were like 17 when they took the pictures. Which is the ultimate irony -- and they pay the price by being permanently put on the sex offender registry because they took a dick pic before 18.
>>
>>103283329
they are detained in a facility away from the public eye until they're not lust provoking anymore, then reintegrated into society
>>
>>103283343
I was thinking more like actual children, not the legal definition of children. Kiddos these days can use their parents iPad to visit sites and post their crotchless diapers. Does the judge smash a barbie in front of them?
>>
>>103283376
So they must have a panel of kiddolusters to take a vote I assume.
>>
File: pbp.png (690 KB, 1799x1566)
>>103266655
They used to put ads in the most popular porn publications. The ad was extremely suggestive, but the magazine featured in the ad was legal. However, that magazine in turn carried another ad, this time for the bait. The target had to provide his private information to receive one of those for free. With that the deal with the devil was done, and the target had to obey or else.
>>
>>103283398
they have a device made out of sophisticated computer users called a kidar, and they measure the frequency and intensity of "uoh"s to determine if the subject is lust provoking or not based on a threshold
>>
>>103283444
That sounds very sophisticated. I assume this was headed up by a descendant of Dr. Kraft von Ebbing? Or is there perhaps a new expert in this field?
>>
>>103283382
I don't know what the other poster knows, but it has come to my knowledge through unspecified means that especially girls really like filming themselves naked, and they are treated, legally, like any other producer of so called CSAM.
which is indicative they don't really care about children "safety"
>>
>>103283480
he seems to be known exclusively by his initials as Dr. T.o.T., not much else is shared by the government officials in charge of the program
>>
>>103283492
Laws were written like that to protect politicians from their victims.
>>
>>103283516
Roger that.. Well it sounds like he must be the ace of the program, hence the need for the anonymizing efforts.
>>
The smell of my crotchless diapers ran everyone off.
>>
>>103283218
>ironic
you give this place too much credit
>>
>>103283218
Maybe, just maybe, people would be more open to """""""""anti""""""""" CP measures if they actually did anything to stop the spread of CP (they don't) or lead to arrests of the producers (they don't).
As it is right now it's in the best case questionable as fuck with no evident upsides to anyone (especially not the people actually sifting through this shit that aren't pedos) and in reality is just used by the terminally online to take sites down they don't like by spamming CP and then crying about it to the hoster.
>>
>>103284210
Also the ethics of using REAL abuse material (often against the explicit wishes of the victims) instead of AI generated shit to """""""""""catch""""""""""" le bad guys are beyond fucked up even considering the usual mental gymnastics people pull when it comes to why we """""""""""""""""""""""""""""""""""""""""need""""""""""""""""""""""""""""""""""""""""" to do this shit.
>>
File: 1727063736583523.jpg (108 KB, 1198x1227)
>>103283218
>ironic
>>
File: ArabianPengerRide.gif (1 KB, 32x32)
>>103258637
ko
>>
AI trained on 'p is known as a GOON not a GAN
>>
File: peped.gif (49 KB, 800x942)
>>103283218
You will be boiled alive just for fun, it will be great for us, and is coming sooner than you think.
>>
>>103258637
do
>>
>>103284985
Well, if that's the case, then there's no reason to keep witnesses alive. Like in Delphi.
>>
>>103282858
>That's like being upset that morticians and autopsy people exist
This but unironically
>>
File: diversespook.png (747 KB, 2048x2048)
>>103282206
hello fedboy
>>
>>103261459
Man, it is really easy to get bad ended in this game about cheesepizza
>>
>>103261459
shhh
>>
File: uh35356016(93).jpg (17 KB, 720x720)
>>103260861
this
>>
File: 1727328427028492.jpg (83 KB, 792x783)
>>103283000
>watch war videos of ppl getting killed
>doesn't make me a murderer
>watch porno of girls being fucked
>still makes me a virgin
>>
>>103265600
Bstaber
>>
>>103274024
research is research
>>
>>103282206
Sir, the conspiracy theories are about rich elites actually fucking kids, not looking at videos.
Retard.
>>
File: 1695908380319853.gif (31 KB, 128x128)
>>103282451
>it encompasses more.
like 16 year-olds sending nudes to each other, it's self-produced CSAM
>>
I hate the jews so fucking much.
>>
>>103258637
high IQ white man move.
EPIC!
>>
>>103284156
No, that's why I ended the sentence with what I did.
>>
>>103284210
>it doesn't do anything
>it's just used as an attack against adversaries
Proof?
>>
>>103287806
The SA stands for sexual abuse. A 16yo sending nudes to their teenage crush isn't abuse. The wording is ok and an improvement. Definitely better than CP, because the P kinda implies consent.

The problem is women redefining it.
It is rape if you have sex with a consenting but drunk women, then any teenage pic can be redefined as sexual abuse as well.
>>
File: 1713221347229763.png (631 KB, 884x874)
>>103288372
>the law isn't the problem
>women are
>>
>>103288380
I mean, both deserve to be mass-murdered.
>>
>>103288384
It's a bit extreme, but if that is the cost of a better society...
>>
>>103288394
And if the bit with society doesn't work out we'll at least have achieved justice.
>>
>>103288079
The proof is the fact that they have a large database of PhotoDNA hashes of CSAM that every single big tech organization has access to, but YOU aren't allowed to have.
YOU don't even get an API where you can send a hash and have it tell you whether it's CSAM.
YOU get NONE of those anti-CSAM measures to protect your little online forum.

Why would this be the case?
The only reason i can think of, is that they want to keep the option to plant CP on you.
>>
>>103288384
>>103288394
>>103288399
Nah, just the men who dress up like women and castrate themselves should be exterminated (or the very least, encouraged to rope themselves).
>>
>>103288423
No, literally all military, law enforcement, and women. Not a single one must be spared, at the risk of nuclear civil war if need be.
>>
>>103288380
>the law isn't the problem
True. The wording is right
>women are
Women and their enablers from the judiciary.
Like with everything women cause, they are never themselves directly responsible for it, but it's always by proxy. By someone with power simping for women.
A judge could simply say:
>wtf is this bullshit? You consented to it, whore, fuck off, this is not what the law is for and nobody can prove this "emotional rape".
>>
File: 1623536499035.png (470 KB, 1700x800)
>>103288450
I can't wait for the ego death once they realize that without enablers they're dead in the water.
>>
>>103288423
>trannies are responsible for muddying the definitions of sexual abuse, porn and rape
No, this one is on women and the system enabling them.
There is nobody else to blame here. Trannies don't benefit from this. They might even be harmed by this.
>>
>>103258637
>>
>>103288596
>stil no model leak
stop teasing me, faggot
>>
File: 1718698142871963.jpg (1.39 MB, 1390x3139)
>>103258637
They can train models on child porn while i would go to prison for 10+ years if i posted a photo of myself i took when i was 17
>>
>>103258637
cool anime filter bro
>>
>>103288696
rules for thee not for me
>>
>>103258637
if such a model is legal to use, would that not set a precedent that ai models don't count as being the source material?
>>
>>103258637
Thank you. Please release the working model so we can all be safe.
>>
>>103288408
To this day I'm still convinced that Apple backpedaling on the plan to scan all images locally wasn't the result of privacy concerns at all, but of the realization that people could end up extracting the list of flagged hashes from an iPhone.
I can imagine that some people's work would become much harder if every person running a forum or an altchan could have a tool to remove cp instantly.
>>
>>103288372
We may not consider it abuse but the law considers anything sexual involving a minor to be abuse. Well, except actually having sex since the age of consent is usually under 18. Or if it's a minor with another minor, I don't think there's a crime there but I know better than to actually go looking stuff like that up. Also you must be a pedophile if you don't consider any such material to be abuse. So sexts are self-abuse and I won't hear otherwise you worthless nonce and if I see anyone saying this in public you'd better believe I'll have to try to beat the shit out of the scumbag because we do not tolerate pedos.
>>
>>103258637
>https://arstechnica.com/tech-policy/2024/11/ai-trained-on-real-child-sex-abuse-images-to-detect-new-csam/

>An expansion of Thorn's CSAM detection tool, Safer, the AI feature uses "advanced machine learning (ML) classification models" to "detect new or previously unreported CSAM," generating a "risk score to make human decisions easier and faster."

Ok so it is a classification/detection model. It sounded like they were going to create a generative model, like a lora for cp. I couldn't understand why they would want to do that.
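For anyone wondering what a "classification model that generates a risk score" looks like structurally: it's the same shape as any binary image classifier with a sigmoid on the end, nothing exotic. A generic, untrained sketch; the backbone choice, preprocessing, and threshold are assumptions on my part and obviously not the vendor's actual model:
[code]
# Generic binary image classifier producing a 0..1 "risk score" for moderation triage.
# Illustrative only: untrained, with an assumed ResNet-18 backbone.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

backbone = models.resnet18(weights=None)                  # assumed backbone (torchvision >= 0.13 API)
backbone.fc = nn.Linear(backbone.fc.in_features, 1)       # single-logit head
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def risk_score(path: str) -> float:
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return torch.sigmoid(backbone(x)).item()           # score in [0, 1]

# e.g. route anything with risk_score(p) above some assumed threshold (say 0.9) to a
# human reviewer rather than acting on it automatically, as the article describes.
[/code]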
>>
>>103259003
>/s
Go back to >>>/r/eddit.
>>
File: laugh.gif (3.79 MB, 640x276)
>>103258637
>>103258974
>MFW the government has an actual CP database, verified and tagged.
>MFW Unteralterbach was actually a documentary
Apparently we live in a two-tier justice system, where the government and the rich can do whatever the fuck they want while the peasants go immediately to jail at the slightest smell of something illegal happening.
>>
>>103282758
Not him but I've seen those lolihaven.rs ads and I felt nothing, neither arousal nor disgust. I guess some of us sociopaths who don't care about children getting exploited nor attracted to them vet these stuff.
>>
>>103282758
>go into the office and look at cp all day while collecting a salary
Unteralterbach was a documentary.
>>
>>103288696
Sauce?
>>
>>103282206
>>103282303
>The FBI's expert CP watchers aren't pedophiles! They work for the government, that means they're perfect!
>>
>>103282303
>They have been co-opted in recent times to protect politicians and are corrupted by their department heads.
They're jews that protect other jews and honeypot or plant material on goy politicians for leverage. That's every division of FBI/CIA/NSA, not just the "anti"-CSAM divisions.
>>
>>103288372
actually it means child sexual arousing material, because of its lust provoking nature
>>
File: 1730871893694998.jpg (27 KB, 400x400)
>>103289929
>I couldn't understand why they would want to do that.
>>
File: Becky_mesugaki_2.webm (1.15 MB, 1920x1080)
>>103258637
based
>>
>>103289610
This, no corporation on earth gives a shit about their users privacy. The only reason they backpedalled is because they realized researchers and skiddies alike would be all over their scanner.
>>
>>103269879
>You can generally brute force them into reproducing content that they were trained on. It's not perfect and it's still statistical but you can do it.
it's the default; most of the effort involved in generation is making sure it lerps the data instead of reproducing it directly, so you can at least pretend it's not just a fancy zip file
>>
>fbi and interpol have the world's largest cp collections
remember this fact the next time you see the headline
>[political dissident] arrested for possession of 47456 terabyte of cp
>>
File: 1710208915441410(1) (2).png (280 KB, 1104x2096)
>>103291026
Friendly reminder.
>>
>>103290072
Cherry no Manma
i haven't watched it but i don't think she is 12. saved the pic cuz it was funny
>>
File: bad for brains.png (182 KB, 323x412)
99% of CP is self-produced by kids with bad parents who were given smartphones at a young age.
The only people who should get charged are the parents.
>>
>>103291354
>sir i'm putting you under arrest on the charges that your daughter is a slut
>>
File: 1729963977217325.jpg (89 KB, 1088x822)
>>103291405
Negligent sluttery of a person's under 18?
>>
>>103291405
that would make parents marry their daughters off close to puberty. nice.
>>
File: 1731209410595979.png (3 KB, 368x317)
>>103291354
>t.
don't blame others for being a whore
>>
>>103291354
>The only people who should get charged are the parents.
Also all cell phone producers. They are complicit for not putting RealID mandatory verification in place. Also cell phone companies, same reason. Also social media sites, same reason. We need Real ID everywhere, every packet.
>>
>>103291447
parents are blamed if their kids do other stupid shit, why should this be an exception?
>>
>>103291453
>t. Bioluminescent CIA employee of African descent
>>
>>103291488
because it's the inherent nature of being female to try appealing to stronger/older men even at a young age
you can at most delay any fuck up until they're old enough to deal with it themselves
>>
>>103291536
>women
>dealing with their own fuckups
oh i am laughing
>>
>>103291536
Then the age of marriage should be like, 15.
>>
>>103259003
>/s
you will never be a woman
>>
>>103258637
It's not a generative AI model, it's a discriminative AI model. For those idiots who don't know the difference, this means it's meant to *detect* CSAM, not produce it
>>
>>103291547
15 should be the age your children are already in grade school and you're focusing sorely on being a housewife
>>
>>103291571
They already have a database large enough to train an AI model, what's stopping them from creating a generative model for the sole purpose to injecting a terabyte of generated CP into their political adversaries and whatever websites they disagree with?
>>
>>103291585
why would they bother? they already have the real deal

>uh you put it there
>>we investigated ourselves and concluded that we were innocent, now go to jail for 40 years
>>
>>103291585
or on digital billboards for that matter.
>>
>>103258637
What's the purpose of that?
>>
>>103291571
>doesn't know how to create a generative model from a discriminating one
>>
>>103283402
Sounds like they baited them without having the actual thing, all they need is the target's request for the thing to make them fall.
>>
>>103291642
They don't even need to investigate anything, just dismiss it. See Matt Gaetz.
>>
>>103291851
>no standing
>>
>>103259119
Kek
>>
>>103258637
>Once suspected CSAM is flagged, a human reviewer remains in the loop to ensure oversight
>>
>>103260832
GAN trains a second model that will produce cp according to the initial classification model
>>
If 50,000 people drive by a digital billboard with CSAM, will all their license plates be flagged and will they be sent to re-education camps?
>>
>>103291938
>chatting in the break room with a coworker
>pager beeps signaling there's new flagged material to review
>get up and adjusts pants
>sigh and grab box of tissues nearby
>"well, got work to do. see ya later, bob."
>>
>>103292328
tissues too messy. get a bag of t-shirt rags off amazon.
>>
>>103292347
>"uhm, chief, I was looking through our monthly expenses, and... why do we have a contract with amazon for 10,000 kid-sized panties every month?"
>>
>>103291958
They will be given suspended sentences, put on the sex offenders register and fined. Nobody is going to jail them or send them to some camps because that would be too expensive.
>>
>>103292668
massively sending people to camps, even a comparatively small number like 50,000 is something that can only happen in fiction
>>
>>103292392
wasn't me, I bought out all their crotchless panties and diapers.

>>103292728
>can only happen in fiction
Or during a world war. FEMA camps are already set up for much larger numbers.
>>
>>103292945
Just need to have parades in every city.
>>
>>103289610
>>103290967
You're delusional if you think the scanning would be done locally. If it's in iCloud, they can do the scanning on their servers. If it's local device only, they just have to send the hash to their server to check.
That, and hash checking is how it was done a decade ago. But it's unreliable. Now they have classification models that analyze the image and output a similarity score.
>>
>>103293097
WOPR concluded that the only winning move, is not to play.
Use phones for voice and text. Everything else on a minimalist desktop.
>>
>>103293097
>You're delusional if you listen to what they said they were going to do
Yeah I'm sure they called it local scanning because it's such a hip and trendy marketing term you fucking retard.
>>
File: anti-semites.jpg (83 KB, 1680x945)
>>103262077
I know exactly where you get yours.
>>
File: 1724112569160645m.jpg (80 KB, 1024x941)
>>103292728
>massively sending people to camps, even a comparatively small number like 50,000 is something that can only happen in fiction
>>
>>103293154
>they said it's local so it must be true!
>>
>>103282206
>you do realize SOMEONE has to look at it, inspect it, document it, and store it securely
I don't know where, but I remember seeing some documentary/interview or something where they explained that the FBI apparently has a very high turn over rate for those positions because it just traumatizes and destroys people mentally to go through so high volumes of CSAM.
>>
>>103293520
That's because they did not hire from the darkweb. Some would work for free. Sometimes the solutions are just too simple.
>>
>>103293470
>yeah bro they just like called it local because uh... uh...
So this is what it looks like when someone has an IQ lower than their own body temperature.
>>
>>103292945
uoh
>>
just another thread with pedos acting like they can reason their way out of being persecuted.
>well ahcktually don't authorised people need to be able to look at the cp, check mate, mate
>she was 17 and 364 days old
>she says she loves me
normies want to kill you. nothing is going to change that any time soon.
>>
>>103293520
must be traumatizing having your marriage and family destroyed because you're not interested in hags anymore
>>
>>103293097
>>103293470
I legit can't tell what point you're trying to make.
>>
>>103293795
neither does he
>>
>>103259329
>anti-pedo AI is leaked and used to produce CP on the dark web
Wouldn't that be good, since it gives pedophiles an AI-made masturbatory substitute that doesn't involve abusing real children to produce? I mean I get the ethical concerns but hasn't our culture moved waaaay beyond that with everything that has become publicly acceptable now?
>>
>>103294449
I'd rather just kill culture.
>>
>>103294449
Maybe there is a middle ground. AI generated but using real people as a foundational model. But not kids, instead elderly people de-aged to be children so you can get consent. Diapers to diapers so to speak.
>>
>>103294463
Nah, killing culture is better. Can you imagine the butthurt once the pedos bring out the woodchippers to chip apart normies? I'd pay money to be able to see it.
>>
>>103294486
On demand snuff films? If we are going that route then I want to bring back gladiator tournaments, the real ones. 50 internet scammers go in, 0 come out.
>>
>>103294508
sar, do not redeem the sword...
>>
>>103294508
Nah let one get out and give him a $20 google play card. Then once a year you have a free for all with the previous winners, hunger games style.
>>
>>103294756
I guess that works. I was thinking 50 scammers go in per hour, 24 hours a day 364 days a year. Letting one out per tournament is a lot. Maybe those that get out go to the weekly elite match tourney.
>>
File: oy.jpg (440 KB, 683x1090)
>>
Local when?
>>
>>103294773
that's not even close to reaching 6 million scammers at the end of the year. needs more colosseums.
>>
>>103294801
yes, funds from each build more and matched by all the governments via grants.
>>
>>103293520
Yes, this is true. Same with any line of work that involves highly disturbing material
>>103293772
>well ahcktually don't authorised people need to be able to look at the cp, check mate, mate
I want you to walk through the process, very slowly, of a forensics lab tech that just received a 3tb hdd with the orders "find any CSAM on this and document where the images were downloaded from so we can use it in court"
Is the sheriff supposed to arrest the lab tech afterwards for looking at CP? Does he go through the long and arduous process with his monitor turned off dare he see a naked child? Context matters for everything.
>>
>>103293520
Legends have it that when they catch a good black-hat hacker they offer him to work for them instead of going to jail. Maybe they should do the same with perverts, offer him to do that job. This may even cure the guy, he would see so much of this shit he would lose interest in it, just like when you eat so much of something you become disgusted of it for life and can never eat it again in your life.
>>
>>103295254
>CP is highly disturbing material
Do normies really?
>>
>>103295824
NPCs think every piece of cp is intense and violent rape with lots of blood and crying
in reality only a subset of people are into that and most groups are even against sharing that kind of stuff (similar to how people are segregated in regards to rape/bdsm in regular porn)
99% of cp you come across is either sole-female or stuff even milder than vanilla amateur porn
>>
>>103295797
...Because the extremely obese fatties you see on tv are clearly disgusted by fast food and pizza? (Pun not intended)
>>
>>103295920
kill yourself
>>
>>103296025
Imagine them liking it.
>>
>>103295920
> sole-female
This made me laugh because I'm not sure if it's a typo or just footfags being footfags.
>>
>>103296059
>adjective: sole
>one and only.
>"my sole aim was to contribute to the national team"
>>
>>103296032
kill yourself
>>
>>103296534
>>GET PREGNANT.
>YES!
>>
>>103262037
>I've read that on the contrary the books they burned were only pedo books and other degenerate shit
Let me guess, all the superlatives you've read about nazis come from the totally objective sphere of far right totally-not-grifters?
>>
>>103296569
kill yourself
>>
>>103296661
And if not?
>>
>>103294449
sup destiny
>>
File: 1704190033167065.jpg (25 KB, 700x438)
>>103296661
uooh child belly erotic
next, you're going to say "kill yourself"
>>
>>103266821
>Even if it has a 1 % rate of false positives, it means that for every 1000 posts, 100 will be falsely flagged.
10
>On social media with millions of posts per hour that would mean about 1000 false positives each hour
10000 per million
imagine being an order of magnitude wrong in both directions
>>
File: 62x21z-1116660541.jpg (78 KB, 620x432)
>>103292392
>>
>>103282206
>Once an image is hashed, it doesn't have to be looked at again, so it is stored on an extremely secure crypto server that only a few people have access to.
holy shit imagine being fucking retarded enough to believe this. how do you think vice departments stored this before file sharing?
>>
>>103297182
In a vault with limited access, most likely. Got any insight for us or are you just blowing smoke?


