He's calling for the death penalty for people asking Grok to generate bikinis for those under 18. This could be HUGE.
https://www.youtube.com/watch?v=quaLo8vET-Y
>>107747692
Saaaaaar!
>>107747692
>He's calling for death penalty for people asking Grok to generate bikinis for those under 18
Sounds like a projecting pedo.
good morning saar
>>107747692
What is worse? People asking for it, or the AI delivering it?
>>107747692
Can't watch, but if true, that only signals he has some shady shit going on and is desperate to have people look in the opposite direction.
>>107747704
>Sounds like a projecting pedo.
similar to "pedo hunter"
Aren't consumer AIs supposed to be KIKED to the absolute maximum?
>talks a load of shit
>hasn't tried it
why am I watching this ugly fag talking?
If I use AI to generate a video of this fat jeet getting turned into red mist by a train, have I committed real tangible harm against him?
If not, why is it suddenly the case if AI generates a little girl in a bikini?
>>107747932
Because it's getting too realistic and we can't just let child rapists (like the one in OP's pic) get away so easily by claiming it was AI generated. The world would have a lot more Epstein islands if we allowed that to happen.
>a literal who saying retarded shit
twitter is now doing the thing the users are asking it to do.
>>107747692
Amerifats will say shit like this, meanwhile their entire government is run by child rapists and they're doing fuck all
>>107748000
Where's the Amerifat in that video?
>>107747893
They are, which is why the most nude you can get is a bikini. However, xAI are also massively incompetent, so they let this slide.
>>107747932
It’s ok to generate it imo, but the problem happens when you share it, especially in the comments of the very post that you are making nude.
>>107747692
i think i saw that guy in one of those daycare center news stories in minnesota
>>107747692
Americ*ns are so weird lole
is ALWAYS the projecting antis that get caught with seepee, never tell your illegal porn, just like you never do illegal drugs in front of cops and project to the public never do drugs
>>107748000
That’s a Canadian born to Indian parents in Canada; he is also a “secular Muslim” (not abiding by any of their rules).
>>107747692
Didn't he rape his gf or something? God, jeets are gross.
>>107747692>
>Americans freely give their high tech tools to thirdies
>Let them abuse their water and electricity to use it
>Then let them generate gooning material for them.
And thirdies will STILL hate America. It baffles me, but thankfully I'm uninvolved.
>>107747692
uh oh you stinky jeet, not muh fake pixels
>>107747692
Elon Musk personally intervened to unban a Xitter account that posted a toddler being sexually tortured. This is nothing new.
>>107748268
that guy must be a hardcore bluesky & discord user
>>107748364
Can you not use a local Stable Diffusion to achieve the same? I don’t understand how Grok is different from what came previously. I also don’t want to taint my Google search history asking a question that I’m only sort of curious about.
Pedophilia hysteria exists as a useful tool for justifying censorship and easy, no-questions-asked reputation smear campaigns. AI is just revealing how much it was never about protecting real children from abuse.
Have a nice day you all.
>>107747710
Elon Musk developed Grok
Therefore it is Elon who produced indecent images of children
Blame him
Canadians can say any dumb shit and keep their national image intact because people just automatically assume they're americans, kek
>>107748364
on all X pictures, in the bottom-right, there is a link to edit an image, which opens up the image in a popup with the grok chat prompt. the normies and shitlibs are throwing a fit because it makes it easier, that's it.
What's with all these faggots ITT acting like it's no big deal, like the porn being AI generated is just anime loli stuff?
The scandal is that grok is editing images of REAL children in sexual contexts, unsolicited, by predators openly on Twitter.
>>107748430
>AI just revelling how much it was never about protecting real children from abuse.
AI is harming real children, you faggot retard.
That's what this controversy is about.
>>107748430
>oh, no, someone might generate photos of my 6 year old daughter naked
So? Someone might masturbate imagining her naked. Should we criminalise thoughts? Everything is so tiresome.
>>107748430
“Harming”
>>107747704
A hundred percent true. I say this as a pedophile myself who once felt guilty and thought I would like to personally kill all pedophiles, and then realized all I need is to find some professional help and make sure I don't harm anybody.
>>107748409
Thank you for explaining, anon.
>>107748451
except pictures are not thoughts and this strawman is dumb
>>107747957
No. We learned from the Epstein files that the illegal part is that “every time the photo is seen, the subject of the photo, the victim, is harmed.”
Since there is no actual victim, there is no harm, and nothing illegal in an AI generated depiction according to US law.
Wow, who could've seen this coming?
>>107748485
kill yourself.
>>107747692
so we're back to the anime porn logic again.
>>107748498
>>107747932
Because actual children were used to help generate said images in the training data. This is inherently harmful, though I would agree it is to a much lesser degree. Still more harmful than anything drawn, though.
>>107748538
Isn't the crime there that the material was used in the first place? It literally shouldn't be in the training sets, regardless of whether it's then used for generation or not.
In other words, Musk, Altman etc should be in jail already.
>>107748538
> AI trained on real photos
Ok, good point.
Also, I was wrong about the case being from the Epstein files; it was the FBI using actual material to entrap people into downloading it, which made them one of the largest distributors of that material in the world. Not surprising.
I’m wondering how they quantify the “harm” and how much $ the victims get, pro-rated from the amount the photos contributed to the generated images. That’s an impossibly complicated calculus to embark on. Also for fines and sentencing.
>>107748635
> in the training sets
We’ve had algorithms to age-up and age-down faces for decades now. It doesn’t necessarily need actual pictures to generate it.
There are models out there that are over 18 but look extremely young. That's why video producers keep detailed records. Being over 18 is the *only thing that matters* when filming that kind of material.
>>107748563
I agree entirely with this. Including all those books.
>>107748606
This is a troubling discussion because a part of me can't be very nuanced about it: the second I hear anything about this shit being distributed, I'm disgusted and want these people locked away. But if I try to be rational, things like that, with consent from the victims in the material, could potentially help stop future crime.
I'd take it a scary but cautious step forward and say that if there was proof that having access to the aforementioned "consent given" materials would help current pedos not harm kids, I would be willing to discuss that. It seems kinda crazy tho, and immediately flags scary shit in my head. Sad we've come to this, but whatever, guess it's human nature to a degree. If anyone reading this is a PDF file, get help.
If pedos just did this locally on their own computer at home no one would know and thus no one could care. But no, pedos are retarded and any public service will be subjected to them shamelessly uploading their shit to it and inevitably kill the service. RIP so many anonymous file hosts.
>>107748660
No "young looking adult" is going to look like a 4 year old.
They almost certainly have little kids in the data set; whether it's straight up CP or not, I don't know, but you shouldn't have pics of minors in your data set without parental consent, imo.
>>107748406
>This, but unironically.
>>107748696
> look like a 4 years old
That’s pretty disgusting; i hadn’t considered that age as potentially being the subject matter.
I was already disgusted at the beauty pageant industry for kids when reading about the JonBenét Ramsey case. How was that not frowned upon?
>>107748696
most based african ever desu
didn't war criminal, predator-defender benjamin netanyahu sit down to "have a chat" with musk before all this shit went up?
>>107748512
t. pedo
>>107747692
just remove all image/video generation.
fuck the screeching digital "artists" though.
Fuck retarded normies! There is nothing wrong with pictures of children in the training dataset because those pics are safe! Most child pics are posted by their most normie parents. AI simply generalizes how human body looks under clothes.
And fuck this "harm" narrative. There are no victims in AI generated context. It actually reduces harm because people get a harmless vent.
All this stuff is just an excuse to tighten censorship for cattle that is happy to get castrated.
>>107748414
Occam's razor, anon: they're the pedos generating and consuming it
These virtue signaling retards are going to usher in the most draconian laws, ones that will give the state the right to actively scan your computer for "harmful material," which of course will include wrongthink, which is whatever the state wants it to be.
>>107747692
mutahar is a pedophile. he goes on 4chan and uses tor. when explaining why he uses tor, he was about to reveal the real reason (cheese pizzas), then he took a long pause and said it was to find racists, like he thought that was the best thing to deflect to.
>>107748407
That's because to the rest of the world you are american lol
>>107748833
>>107748898
>>107748451
>>107748387
Sorry pedo, but it's actually bad when people use AI editing to edit images of real women and children for sexual harassment purposes, and get away with it.
Especially when it happens in a public environment like Twitter.
Every single time someone is outspoken like this it turns out they're a pedophile themselves and this is how they justify their own actions.
He has terabytes of pizza on his computer.
>>107748948
sounds like you're doing the same THOUGH
>>107748946
It's actually not good to coddle the feelings of rando whores because you think the internet (words on a screen) is a public place.
Perhaps the whole social media experiment wasn't meant to be permanent?
>>107747692
>>107748430
>>107748538
>>107748696
>>107748946
>falling for more deliberate alarmism that exists to encourage (((regulation))) for open source technology
>>107748965
I'm not outspoken about people generating pizza with AI.
I don't care.
>>107749004
you clearly do care. enough to make a post about it
>>107748999
Grok is open source?
>>107748968
>>107748999
Child porn is bad
>>107747692
>could be HUGE.
Nobody cares what Mutahar Singh has to say.
>>107748833
>And fuck this "harm" narrative.
Maybe there would be no harm if it were in your personal PC, but doing it right under their post is retarded and deranged.
>>107749184
>personal PC
Lol, I meant Personal Computer, whatever.
>>107749184
If you're such a snowflake who feels unbearable pain from edits of the pics you posted yourself, there is a simple solution: stop posting your pics.
But I'd actually argue that it's good! It teaches you to grow thicker skin and privacy rules.
>>107748538
The reason why cp is illegal is because its production harms children. If you trained AI with real cp, then you could argue it promotes the production of cp for AI training.
If you use AI to edit existing photos of girls so that they appear naked, no child is harmed. Edited pictures are fiction; the act of taking a naked photo of her didn't happen. Though, publicly posting such pictures might count as some kind of harassment.
>>107747720
>I'm not but I am a shill
>>107749303
photoshopping real people was already illegal before AI existed.
>>107749037
The narratives around grok will be used as a way to encourage the regulation of open source AI. Surely you know how lawmakers think?
>>107749303
>might count as some kind of harassment.
>might
>might
>>>>>>>>>>>might
I'm 100% certain it is.