/g/ - Technology


Thread archived.
You cannot reply anymore.




File: 1727630182478842.jpg (43 KB, 900x506)
we talk a lot of shit about LLMs, but let’s be real: with the bad there’s always some good.
let’s use this thread as a chance to give thanks for the times this tech has actually helped us.

yesterday, i asked it about a medical issue because i was in a lot of pain. people i talked to said i should go to the hospital. i explained what had happened in the previous days, my symptoms, and it broke down logically why i was experiencing them in relation to those events. it even gave me some simple tests to do at home to make sure it wasn’t anything more serious. i used that advice to stay home, hit myself with some cold shock in the tub, then rest in a position it recommended to relieve the most stress on my body and help it heal.

today i woke up feeling completely fine. if i had actually gone to the hospital like people told me, i would’ve been stressed out for hours, uncomfortable, unable to sleep, and maybe they would’ve done weird unnecessary shit to me when all i really needed was a nap.
>>
>>106471515
>it even gave me some simple tests to do at home to make sure it wasn’t anything more serious. i used that advice to stay home, hit myself with some cold shock in the tub, then rest in a position it recommended to relieve the most stress on my body and help it heal.
... and then you died
retard
>>
>>106471515
yeah?
llms are a correlation machine
it correlates things together
thats how it "learns" and then thats how it generates responses
and so it stands to reason it will be good at correlating things with each other
like correlating symptoms with an ailment

when we say llms are shit, most of us are talking in the context of programming
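the "correlation machine" idea above can be sketched as a toy script. to be clear, this is not how an llm actually works internally — the corpus, the symptom names, and the scoring are all invented here just to illustrate the co-occurrence principle the anon is describing:

```python
# toy illustration of the "correlation machine" idea: rank ailments by
# how often they co-occur with the reported symptoms in a tiny made-up
# corpus. a real llm learns far richer statistics, but the principle is
# the same: frequent co-occurrence -> strong association.
from collections import Counter

# hypothetical "training data": (observed symptoms, diagnosed ailment)
CORPUS = [
    ({"cant pee", "lower back pain"}, "kidney stone"),
    ({"cant pee", "bruising", "impact trauma"}, "bladder contusion"),
    ({"bruising", "impact trauma"}, "soft tissue injury"),
    ({"cant pee", "fever"}, "uti"),
    ({"impact trauma", "cant pee"}, "bladder contusion"),
]

def rank_ailments(symptoms: set) -> list:
    """score each ailment by how many reported symptoms co-occur with it."""
    scores = Counter()
    for case_symptoms, ailment in CORPUS:
        scores[ailment] += len(symptoms & case_symptoms)
    return scores.most_common()

ranked = rank_ailments({"cant pee", "impact trauma"})
print(ranked[0][0])  # -> "bladder contusion" (co-occurs most in this toy data)
```

note that the top answer is purely a function of what was in the corpus — which is exactly why the thread's point about needing enough knowledge to reject bad outputs stands.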
>>
>>106471515
>>106471528
but also this anon is right
you shouldnt be using an llm as a replacement for an actual doctor
its a correlation machine
you have to possess a certain level of knowledge to ask it the right things to correlate, and to reject hallucinations
>>
File: 1736148056339446.jpg (32 KB, 736x414)
>>106471528
>tfw

>>106471531
still its unfair to talk shit about their programming too. theyre simply amazing at it, it all depends on the context though. did you dump all the files in the project? did you autistically explain how you want things to work?
it needs an intelligent operator, and it needs to see your entire codebase to be effective, and that is costly.
>>
I just use it like a little assistant so I don't have to waste hours googling shit.

I think that's its best use case, since it's already hit the development ceiling.
>>
>>106471515
>let’s be real: with the bad there’s always some good
why don't you scoop my shit out of the toilet with a spoon and eat it? with the bad there's always some good
>>
>>106471542
>shit about them programming too. theyre simply amazing at it,
ill stop you right there
no, theyre not. and its not even debatable.
and we have better ais than llms for programming too, ones that actually work

>did you autistically explain how you want the things to work?
it takes more time to do that than to write the code yourself
and very quickly you run out of context space
llms are just unsuitable for programming work, full stop
anyone who says otherwise either tries to sell a product or never used an llm to program something non trivial
>>
>>106471515
of course its retarded to use it as a doctor for a serious issue. but there are tons of "health issues" where its like
>you couldve just googled this yourself and deep dived into it instead of coming in to talk to me
because a lot of advice is going to be basic knowledge, and people just want to hear it from an authority.
im not being a retard and saying you dont need hospitals. just that in this instance, it acted just like a doctor would, giving me basic ass advice and making me comfortable that i wasnt dying
>>
>>106471542
>>106471567
aaactually
there exist solutions to shoehorn llms into things like model based development
where there is an intermediate representation which guarantees correctness

this would work in the following way:
gen ai generates a schema, so to speak
the programmer validates it
and then a fully symbolic ai translates the schematics into code

this is a way where llms (actually gen ai) could be recycled into doing a good job
but no product that would use that technology exists today.
its still in the domain of research
afaik at the forefront of it
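the three-step pipeline described above can be sketched in a few lines. everything here is hypothetical — the schema format, the allowed types, and the generator are invented for illustration; the point is only that the generative model proposes, a validator gates, and a deterministic ("symbolic") translator produces the final code:

```python
# sketch of: gen-ai proposes a schema -> human/tooling validates it ->
# a fully deterministic generator emits code. no model touches step 3,
# so the emitted code is correct by construction w.r.t. the schema.

# step 1: stand-in for what the gen-ai side would propose
proposed_schema = {
    "struct": "Point",
    "fields": [("x", "double"), ("y", "double")],
}

ALLOWED_TYPES = {"int", "double", "char*"}

def validate(schema: dict) -> None:
    """step 2: reject anything malformed before it reaches codegen."""
    assert schema["struct"].isidentifier()
    for name, ctype in schema["fields"]:
        assert name.isidentifier() and ctype in ALLOWED_TYPES

def generate_c(schema: dict) -> str:
    """step 3: purely symbolic translation of the validated schema."""
    body = "\n".join(f"    {ctype} {name};" for name, ctype in schema["fields"])
    return f"struct {schema['struct']} {{\n{body}\n}};"

validate(proposed_schema)
print(generate_c(proposed_schema))
```

the hallucination risk is confined to step 1, where a human review sits between the model and anything executable — which is the whole appeal of the approach.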
>>
>>106471567
>and we have better ais than llms for programming too, ones that actually work
name drop? i'd imagine they use some form of llm to help the process cause why not
>it takes more time to do that to write the code yourself
go time yourself making a website to specifications vs asking an llm to shit it out.
if youre talking about different languages then sure, its not going to be good at the more complex languages. that doesnt make your point correct though. if youre at a master tier level and can run circles around it, that doesnt lower its value for a junior, or someone that doesnt particularly care about the details and wants the end result. it creates the end result.
>and very quickly you run out of context space
youre using the shitty companies (try google)
>>
>>106471648
>namedrop
here: >>106471633
model based development

>go time yourself making a website to specifications vs asking an llm to shit it out.
this is trivial
if i were to shove my codebase into an llm it would give up at the headers stage
picrel is how it looks when i havent yet written all of what i would call "boilerplate"
and the headers are already 330 lines long
i routinely end up with headers between 500 and 1000 lines of code
and said headers are just the "roadmap" for the rest of the program (in case you dont know c)
>>
>>106471648
>doesnt particularly care about the details and wants the end result. it creates the end results.
this is called a boss btw.
the llm lets you become a boss, and it's your employee now.
>>
>>106471648
>>106471685
and ofc i forgot picrel
>>
>>106471531
Every causation is just a very strong correlation.
>>
>>106471694
yeah but causation is a subset of correlations and thats why llms are "dumb"
they cant think a priori
>>
>>106471694
>>106471701
i mean
llms cant think a priori
therefore they dont know the difference between correlation and causation because they cant theorize
therefore theyre "dumb"
>>
>>106471515
Which model were you using?
>>
File: 1729284698843527.png (132 KB, 637x983)
>>106471729
was using chatgpt so probably GPT5 in this convo
>i cant pee chatgpt, help me im freaking out
>wtf go to the hospital bro
>well i also got slammed into concrete
>oh okay... changes things then. might heal f a m
>okay thanks f a m
>>
>>106471776
youre using it wrong
you cannot take its advice

the correct way of using llms is:
>hey chud gpt. draw me up a list of ailments that correlate with the following symptoms
and then YOU make the decisions


