/news/ - Current News
File: 1695692274701515.jpg (212 KB, 1081x1098)
https://www.nytimes.com/2024/05/13/us/instagram-child-safety.html

When a children’s jewelry maker began advertising on Instagram, she promoted photos of a 5-year-old girl wearing a sparkly charm to users interested in parenting, children, ballet and other topics identified by Meta as appealing mostly to women.

But when the merchant got the automated results of her ad campaign from Instagram, the opposite had happened: The ads had gone almost entirely to adult men.

Perplexed and concerned, the merchant contacted The New York Times, which in recent years has published multiple articles about the abuse of children on social media platforms. In February, The Times investigated Instagram accounts run by parents for their young daughters, and the dark underworld of men who have sexualized interactions with those accounts.

With the photos from the jewelry ads in hand, The Times set out to understand why they attracted an unwanted audience. Test ads run by The Times using the same photos with no text not only replicated the merchant’s experience — they drew the attention of convicted sex offenders and other men whose accounts indicated a sexual interest in children or who wrote sexual messages.
The Times opened two Instagram accounts and promoted posts showing the 5-year-old girl, her face turned away from the camera, wearing a tank top and the charm. Separate posts showed the clothing and jewelry without the child model, or with a black box concealing her. All of the paid ads were promoted to people interested in topics like childhood, dance and cheerleading, which Meta’s audience tools estimated as predominantly women.

Aside from reaching a surprisingly large proportion of men, the ads got direct responses from dozens of Instagram users, including phone calls from two accused sex offenders, offers to pay the child for sexual acts and professions of love.

The results suggest that the platform’s algorithms play an important role in directing men to photos of children. And they echo concerns about the prevalence of men who use Instagram to follow and contact minors, including those who have been arrested for using social media to solicit children for sex.

On Wednesday, New Mexico’s attorney general, Raúl Torrez, announced the arrest of three men who were caught in a sting operation trying to arrange sex with underage girls on Facebook. Calling it “Operation MetaPhile,” Mr. Torrez said Meta’s algorithms had played a key role in directing these men to the “decoy” profiles created by law enforcement.

“We could set up a brand-new undercover account, presented as an underage child on that platform, and likely within a matter of minutes, if not days, that child would be inundated with sexually explicit material,” he said, emphasizing the real-world harm that can be caused by online platforms.
The investigation by The Times in February found that thousands of parent-run Instagram accounts attracted sexualized comments and messages from adult men. While some parents described the attention as a way to increase their daughters’ followers, others complained of spending hours blocking users and said they did not understand how the men had found the accounts.

An analysis of the users who interacted with the ads posted by The Times found an overlap between those two worlds. About three dozen of the men followed child influencer accounts that were run by parents and were previously studied by The Times; one followed 140. In addition, nearly 100 of the men followed accounts featuring or advertising adult pornography, which is barred under Instagram’s rules.

Dani Lever, a spokeswoman for Meta, dismissed The Times’s ad tests as a “manufactured experience” that failed to account for “the many factors that contribute to who ultimately sees an ad,” and suggested that it was “flawed and unsound” to draw conclusions from limited data.

When asked about the arrests in New Mexico, Meta said in a statement that “child exploitation is a horrific crime and we’ve spent years building technology to combat it.” The company described its efforts as “an ongoing fight” against “determined criminals.”
‘The Men Engage’

Researchers and former employees who worked with algorithms at Meta, which owns Instagram and Facebook, said that image classification tools probably deserved some blame.

The tools compare new images with existing ones on the platform and identify users who previously showed interest in them, said Dean Eckles, a former Facebook data scientist who studied its algorithms and is now a professor at the Massachusetts Institute of Technology.
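The mechanism Dr. Eckles describes (match a new image against previously seen ones, then reach the users who engaged with the look-alikes) can be sketched in a few lines. This is a toy illustration, not Meta's code; the embedding vectors, the 0.8 similarity threshold, and every function name here are invented for the example.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def likely_audience(new_vec, past_images, interactions, threshold=0.8):
    """Users who previously engaged with images similar to the new one.

    past_images:  dict image_id -> embedding vector (hypothetical)
    interactions: dict image_id -> set of user ids who engaged with it
    """
    audience = set()
    for image_id, vec in past_images.items():
        if cosine_sim(new_vec, vec) >= threshold:
            # Users who engaged with a look-alike image become candidates.
            audience |= interactions.get(image_id, set())
    return audience
```

The point of the sketch: the system never needs to know *why* certain users engaged with similar photos, only that they did, which is how a benign ad can inherit the audience of whatever look-alike content those users interacted with before.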

Test accounts set up last year by The Wall Street Journal found that Instagram’s recommendation algorithm served sexualized photos of children and adults to accounts that followed only young gymnasts, cheerleaders and other children.

Although Meta’s ad system is not exactly the same as that recommendation system, there are “huge similarities between the models,” Dr. Eckles said.

Former Meta employees familiar with its recommendation and ad delivery systems said that safety teams tried to spot harmful ads, like those promoting scams or illegal drugs, but it was more difficult to identify benign ads that were delivered to inappropriate — and potentially harmful — audiences.

Meta allows advertisers to target certain audiences by topic, and though The Times chose topics that the company estimated were dominated by women, the ads were shown, on average, to men about 80 percent of the time, according to a Times analysis of Instagram’s audience data. In one group of tests, photos showing the child went to men 95 percent of the time, on average, while photos of the items alone went to men 64 percent of the time.
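Figures like these come from an impression-weighted average across the test ads, which can be sketched as below. The delivery counts are hypothetical stand-ins chosen to resemble the reported splits, not the Times's raw data.

```python
def male_share(impressions):
    """Impression-weighted share of deliveries that went to men.

    impressions: list of (male_impressions, total_impressions), one per ad.
    """
    male = sum(m for m, _ in impressions)
    total = sum(t for _, t in impressions)
    return male / total

# Hypothetical delivery counts for two groups of test ads:
with_child = [(950, 1000), (930, 980)]   # photos featuring the child
items_only = [(640, 1000), (610, 950)]   # photos of the products alone
```

With these stand-in numbers the child ads land at roughly 95 percent men and the item-only ads at roughly 64 percent, mirroring the gap the Times reported.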
Piotr Sapiezynski, a research scientist at Northeastern University who specializes in testing online algorithms, said advertisers competed with one another to reach women because they dominate U.S. consumer spending. As a result, Dr. Sapiezynski said, the algorithm probably focused on highly interested, easier-to-reach men who had interacted with similar content.
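Dr. Sapiezynski's explanation amounts to a budget-constrained optimizer: if women are expensive to reach because many advertisers bid for them, and certain men are cheap and predicted to engage heavily, a system maximizing engagement per dollar will drift toward those men. A minimal greedy sketch, with entirely hypothetical users, costs, and engagement scores:

```python
def allocate_impressions(candidates, budget):
    """Greedy delivery: spend a fixed budget where predicted
    engagement per dollar spent is highest.

    candidates: list of (user_id, predicted_engagement, cost_per_impression)
    """
    # Rank users by engagement per dollar, best first.
    ranked = sorted(candidates, key=lambda c: c[1] / c[2], reverse=True)
    shown, spent = [], 0.0
    for user, _, cost in ranked:
        if spent + cost > budget:
            break
        shown.append(user)
        spent += cost
    return shown

# Women cost more to reach (heavy advertiser competition); a few men have
# high predicted engagement from interacting with similar content before.
cands = [("w1", 0.05, 0.10), ("m1", 0.20, 0.02), ("m2", 0.15, 0.02)]
```

Under these invented numbers, a small budget is spent entirely on the cheap, high-engagement men before any woman is reached, which is the dynamic Dr. Sapiezynski describes.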

“The men engage,” he said. “The machine is doing exactly what you want it to do.”

Meta, in a statement, acknowledged the competitive ad environment for female viewers and said the “low quality” of the Times ads — from new accounts, with images but no text or explanation — contributed to their being delivered to more men. In addition, Meta said, its Audience Insights data only “shows an estimate of who is potentially eligible to see an ad,” not a guaranteed audience.

Dr. Sapiezynski said even if the system designated the test ads as “low quality,” that did not explain why those featuring children went to more men than those without children.

‘Hey Babe’

A few hours after the first ad was posted, one of The Times’s test accounts received a message and a phone call from a man arrested in 2015 in Oklahoma after allegedly using Facebook to try to arrange group sex with girls aged 12 and 14.

“Hey babe,” another man wrote. He had been arrested in 2020 after contacting a 14-year-old girl in upstate New York over Snapchat and offering to pick her up for sex. Charges against him were dismissed after a court found him mentally incompetent.
A third man, in Tennessee, who “liked” one of the photos had four convictions for child sex crimes — including “sex with a child” in 1999, sharing a photo on Facebook in 2018 of a 3- to 5-year-old “being anally or vaginally penetrated,” and using Instagram in 2020 to solicit nude photos from a 12-year-old girl he called his “sex slave.” (Instagram’s rules bar users under 13.)

A fourth man, whom The Times was unable to identify, offered to pay for sexual acts with the girl in the photograph.

The Times reached out via Instagram chat to anyone who had engaged with the ads and explained that they were tests of the platform’s algorithm being run by journalists. The man in New York continued to send messages inquiring about the girl, asking if she was in her bedroom and if she wanted to have sex. He also tried to call her multiple times through the app.

In total, The Times identified four convicted sex offenders who had messaged the accounts, liked the photos or left comments on them. Their Instagram accounts used real names and pictures, or were linked to Facebook accounts that did. Convictions were found by matching that information with sex offender databases and other public records.

Five other men, including one who posted a video on Instagram of a girl known to be a victim of child sexual abuse, according to the Canadian Center for Child Protection, have arrest records involving crimes against children. Those men whose court records The Times was able to review either pleaded guilty to a lesser charge or were deemed mentally unfit to stand trial.

Instagram’s rules prohibit convicted sex offenders from holding accounts, and The Times used Meta’s tool to report the men. The accounts remained online for about a week until The Times flagged them to a company spokesman.

Asked about the accounts, Ms. Lever said, “We prohibit convicted sex offenders from having a presence on our platforms and have removed the accounts reported to us.”
One of the men, who was convicted in New York of sexually assaulting a 4-year-old girl, falls under a state law — known as E-Stop — that requires sex offenders to register their email address. Every week, the state shares the addresses with technology companies, including Meta.

Ms. Lever did not address how the company uses this information or how the man was able to create an Instagram account.

Some of the men said they responded to the ad out of concern.

One man, who is on parole after spending 46 years in prison in California for murdering his wife, said he was surprised to come across a 5-year-old girl in his feed, which predominantly shows photos of scantily clad or nude adults.

“I got no problem looking at naked women, especially after 46 years in prison,” he wrote. But, he continued, “my attitude about people that engage in child porn or touching a child is pretty simple: Don’t do it.”

The men’s engagement with the ads did not surprise some small business owners interviewed by The Times. Morgan Koontz, a founder of Bella & Omi, a children’s clothing business in West Virginia that promotes itself on social media, said the company received “inappropriate, almost pedophile-type, perverted comments” from men when it started advertising on Facebook in 2021.

“It made our models uncomfortable, and it made us uncomfortable,” she said.
When the company expanded to Instagram, she and her fellow owner, Erica Barrios, decided to avoid the problem by targeting only women, even though fathers and grandfathers are among their regular customers.

Lindsey Rowse, who owns Tightspot Dancewear Center in Pennsylvania, also restricts her ads to women. When she did not exclude men, she said, they made up as much as 75 percent of her audience, and few bought her products. Separately, she limits how often she shares photos of child models in her non-advertising posts because they often attract men, she said.

“I don’t know how people find it,” she said. “I would love to just block all guys.”

Other business owners expressed similar confusion about how their ads were distributed. Since January, the Utah-based children’s clothing company Young Days has seen the share of men its ads reach more than double, with no major changes in its targeting criteria, according to Brian Bergman, who oversees e-commerce. The shift toward men has hurt sales, he said, and the company has since focused on reaching women.

“It’s not a lucrative business for us, but the algorithm keeps pushing us toward men,” he said.
___
>>1294806
>Perplexed and concerned, the merchant
Bitch used a 5 yr old as pedo bait so the pedos would buy the jewelry and dress their child sex dolls with it.

The world has always been this fucked up, but now it's coming into my house and I want them to get off my lawn
>>
>>1294853
Wouldn't it be Instagram's fault for serving the ad featuring young girls to 95% men?
>>
>>1294881
They can take some of the blame, but they're ultimately just giving people what they want. The algorithm is doing exactly what it was designed to do - show people what they want to see. If they axed the algorithm, those people would still find those girls.
You're better off blaming social media as a whole, or the Internet as a whole, or just human nature in general.
>>
This is why India, Pakistan, and Canada should be banned from the Internet.
>>
>>1294889
The people responding to ads with pervert comments were Americans.
>>
>>1294896
Canadians are technically Americans, I suppose.
North Americans, anyway.
>>
>>1294904
Disregard this. They were all Mexicans.
I forgot about the Mexicans.
https://www.koat.com/article/attorney-generals-office-operation-catch-child-predators-social-media/60737581
53 seconds in vid.
>>
>>1294904
If anything Canadians are worse than Americans. Their sense of entitlement is off the charts. I've lived in Canada all my life, so I know.
>>
>>1294813
>Man continues to harass journos even after it's explained to him that it's not an actual child account.
>Is determined in court to be too stupid to charge.
Many mysteries in this universe.
>>
>>1294927
we're better than burgers.
what a shit health system, political system, racist system, wealth inequality system, cop brutality system, and world dominating system
>>
>>1294931
20% of chyyNese is scum
15% of russians
10% of leafs
50% of israelis
55% of burgers
2% of danish
>>
>>1294932
Found the British Colonialist. I'll bet your name is Jareth or Nigel.
>>
>>1294806
LMAO, only child fuckers have accounts in that meta shit. Everybody knows that.
>>
I've long noticed that Instagram REALLY tries to shove pedophilia down your throat.

It's part of the process.
>>
>>1297761
Just wait until everyone on there is forced to make an Oculus Meta avatar. It's about to get much worse. They're already starting
https://www.meta.com/avatars/


