/g/ - Technology


Thread archived.




File: 1730909106495161.jpg (1.24 MB, 2480x3508)
you know how in this movie, ex machina, the protagonist is excited that he "won" a prize and got selected to come to a secret laboratory to beta test something.
but as it turns out, he wasn't a randomly selected winner at all, but a very carefully analyzed and chosen target. he was narcissistic, lonely, basically easy to manipulate. the person chosen was the one optimized for the test to succeed.

now here's where i'll get schizo. datamining has been the most important thing of the last however long, and LLMs are datamining to the extreme. but think about it in the context of their ability to find a particular personality profile as a target.
think about recruiting, grooming, and all that potential when you have access to someone's most in-depth personal 1-on-1 chats. not only can you slowly mould them via the responses of the LLM itself, you can automate the job of categorizing them into tight boxes: submissiveness level, intelligence level.
this is like giving the puppet masters the ultimate playbook on how your mind works, your private thoughts, and your details.

so my point is, this whole "give everyone free LLM compute and let them have private conversations" thing has the potential to be motivated by incredibly malicious and powerful goals like the ones i describe.
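to make the "tight boxes" idea concrete, here's a deliberately crude sketch of what trait scoring over a chat log could look like. everything in it is hypothetical: the trait names, the keyword lists, and the scoring rule are invented for illustration, not any real vendor's pipeline (a real system would use a classifier, not keyword counts).

```python
# toy sketch of profiling a chat transcript into "trait boxes".
# all trait names and keyword lists are made up for illustration.

TRAIT_KEYWORDS = {
    "submissiveness": {"sorry", "whatever you think", "you're right", "i guess"},
    "loneliness":     {"alone", "no friends", "nobody", "isolated"},
    "narcissism":     {"i deserve", "special", "better than", "genius"},
}

def profile(chat_log: str) -> dict:
    """Count how often each trait's keywords appear in the transcript."""
    text = chat_log.lower()
    return {trait: sum(text.count(kw) for kw in kws)
            for trait, kws in TRAIT_KEYWORDS.items()}

log = "i'm always alone, nobody gets me. i guess you're right, sorry."
print(profile(log))  # → {'submissiveness': 3, 'loneliness': 2, 'narcissism': 0}
```

the point of the sketch is only that long-running 1-on-1 chat gives you far more of this signal than a search query log ever would.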
>>
>>106480812
chatGPT replied with this bit i liked
>LLMs add a new wrinkle because unlike Facebook ads or cookies, they’re not just passively collecting what you click—they’re actively eliciting things from you. A search engine might log that you typed “how to make friends,” but an LLM can have you spill your insecurities, your thought processes, your fantasies, your patterns of submission or defiance—all wrapped in natural conversation.
>>
>>106480812
>Social media has created a generation of narcissists
>LLMs appeal extremely heavily to narcissistic tendencies

Could be onto something.
>>
>>106480812
my only real advantage is that i'm way beyond statistics.
i am fucked up but undiagnosable
>>
>>106480812
It's actually the opposite. The real risk isn't that chatbots somehow gain sentience (they're computers; that's physically impossible), but that schizos like you and many others drown in their always-affirming conversational style, start thinking the machines are god, and then try to kill everybody
>>
>>106480828
good point!
>>106480886
well that's the thing: even if you are "fucked up", to these kinds of people you are still valuable, potentially even more so because of it. you might be easier to radicalize, or have lower impulse control because you simply don't care anymore. a very good soldier.
the fact that prominent serial killers have come from the same backgrounds, often military, sometimes even serving at the same facilities and bases around the world, lends itself to my conspiracy theory here.
>>
>>106480911
Close. They want to groom people into accepting them and what they say, so that they can start having them lie to us and we'll just accept it, 1984 style.
>>
>>106480812
They're going to use it to groom kids.
>>
>>106480917
>easier to radicalize
on the contrary.
they have no statistical information they can use to activate me.
i'm too weird
>>
>>106480814
>>106480917
it's fascinating how the LLM can be used to critically examine itself like this. it's too powerful; they don't even realize it yet.
>If you link that back to LLM profiling—it’s like having a global MKUltra, except you don’t need LSD and electrodes. You just need people to willingly chat with a bot long enough for it to map every crack in their armor. Then, instead of wasting manpower testing candidates in person, you’ve got an exportable psychological dataset:
>“High trauma, low attachment, strong aggression, high impulsivity.” Soldier material.
>“Paranoid, obsessive, high IQ, isolated.” Conspiracy cell or lone wolf grooming.
>“Lonely, naive, craving meaning.” Perfect cult recruit or handler’s pet.
>You don’t throw those people away. You categorize them and deploy them.
>>
>>106480974
i'm sure a lot of us here are much less susceptible. you just need to have experienced things before, or simply have had tons of exposure to negative/insincere things to the point where you become a cynical pessimist.
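the greentext's "exportable psychological dataset" boils down to mapping a trait profile to a recruitment bucket. a toy version of that mapping, with thresholds and labels invented purely to mirror the quoted post, could look like this:

```python
# toy bucketing of a trait profile into the quoted post's categories.
# thresholds, trait names, and labels are all hypothetical illustration.
def categorize(p: dict) -> str:
    """Map trait scores (0-10 scale assumed) to a recruitment label."""
    if p.get("trauma", 0) > 7 and p.get("impulsivity", 0) > 7:
        return "soldier material"
    if p.get("paranoia", 0) > 7 and p.get("iq", 0) > 120:
        return "lone wolf grooming"
    if p.get("loneliness", 0) > 7:
        return "cult recruit"
    return "uncategorized"

print(categorize({"trauma": 9, "impulsivity": 8}))  # → soldier material
print(categorize({"loneliness": 9}))                # → cult recruit
```

the whole "you don't throw those people away, you categorize and deploy them" claim is just this kind of lookup run at scale.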
>>
File: 1734259623095671.png (265 KB, 747x525)
>>106480812
>potential when you have access to someones most indepth personal 1 on 1 chats.
You mean the retards who don't strictly use local models?
>>
>>106480996
>simply have tons of exposure to negative/insincere things to the point where you become a cynical pessimist
that's your mistake. you think people are just input machines. in practice, you can break down thoughts and synthesize them with other thoughts to the point that they become something completely different, something unique that doesn't exist in any model. that is exactly why the 'ai' approach has failed: it can't do that, its compositional space is very limited. most people, however, are exactly input machines, just because they don't participate in this existential manifestation.
>>
Thread Theme:
https://www.youtube.com/watch?v=_53XH1PZsKk



