we talk a lot of shit about LLMs, but let’s be real: with the bad there’s always some good.
let’s use this thread as a chance to give thanks for the times this tech has actually helped us.
yesterday, i asked it about a medical issue because i was in a lot of pain. people i talked to said i should go to the hospital. i explained what had happened in the previous days, and my symptoms, and it broke down logically why i was experiencing them in relation to those events. it even gave me some simple tests to do at home to make sure it wasn’t anything more serious. i used that advice to stay home, hit myself with some cold shock in the tub, then rest in a position it recommended to relieve the most stress on my body and help it heal.
today i woke up feeling completely fine. if i had actually gone to the hospital like people told me, i would’ve been stressed out for hours, uncomfortable, unable to sleep, and maybe they would’ve done weird unnecessary shit to me when all i really needed was a nap.
>>106471515
>it even gave me some simple tests to do at home to make sure it wasn’t anything more serious. i used that advice to stay home, hit myself with some cold shock in the tub, then rest in a position it recommended to relieve the most stress on my body and help it heal.
... and then you died
retard
>>106471515
yeah? llms are a correlation machine
it correlates things together
thats how it "learns", and then thats how it generates responses
and so it stands to reason it will be good at correlating things with each other
like correlating symptoms with an ailment
when we say llms are shit, we (most of us) talk in the context of programming
>>106471515
>>106471528
but also this anon is right
you shouldnt be using an llm as a replacement for an actual doctor
its a correlation machine
you have to possess a certain level of knowledge to ask it the right things to correlate, and to reject hallucinations
>>106471528
>tfw
>>106471531
still, its unfair to talk shit about their programming too. theyre simply amazing at it, it all depends on the context though. did you dump all the files in the project? did you autistically explain how you want things to work? it needs an intelligent operator, and to be able to see your entire codebase, to be effective, and that is costly.
I just use it like a little assistant so I don't have to waste hours googling shit. I think thats its best use case, since it has already hit its development ceiling.
>>106471515
>let’s be real: with the bad there’s always some good
why don't you scoop my shit out of the toilet with a spoon and eat it? with the bad there's always some good
>>106471542
>shit about them programming too. theyre simply amazing at it,
ill stop you right there
no, theyre not. and its not even debatable.
and we have better ais than llms for programming too, ones that actually work
>did you autistically explain how you want the things to work?
it takes more time to do that than to write the code yourself
and very quickly you run out of context space
llms are just unsuitable for programming work, full stop
anyone who says otherwise is either trying to sell a product or has never used an llm to program something non trivial
>>106471515
of course its retarded to use it as a doctor for a serious issue. but there are tons of "health issues" where its like
>you couldve just googled this yourself and deep dived into it instead of coming in to talk to me
because a lot of advice is going to be basic knowledge, and people just want to hear it from an authority. im not being a retard and saying you dont need hospitals. just that in this instance, it acted just like a doctor would, giving me basic ass advice and making me comfortable that i wasnt dying
>>106471542
>>106471567
aaactually
there exist solutions to shoehorn llms into things like model based development
where there is an intermediary representation which guarantees correctness
this would work in the following way:
gen ai generates a schema, so to speak
the programmer validates it
and then a fully symbolic ai translates the schematics into code
this is a way where llms (actually gen ai) could be recycled into doing a good job
but no product that would use that technology exists today.
its still in the domain of research
afaik at the forefront of it
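for anyone who wants it concrete, here is a toy sketch of that three-step split. everything in it is made up for illustration: the schema format, the allowed-type whitelist, and the template generator are hypothetical, and the "gen ai" step is just a hardcoded dict instead of a call to a real model.

```python
# hypothetical "schema" an llm might propose. in a real system this would
# come from a gen-ai model; here it is hardcoded so the sketch is runnable.
llm_proposed_schema = {
    "name": "User",
    "fields": {"id": "int", "email": "str"},
}

ALLOWED_TYPES = {"int", "str", "float", "bool"}

def validate(schema: dict) -> dict:
    """the human/checker-in-the-loop step: anything outside the allowed
    subset is rejected here, which is where hallucinations get filtered."""
    if not schema["name"].isidentifier():
        raise ValueError(f"bad type name: {schema['name']!r}")
    for field, typ in schema["fields"].items():
        if not field.isidentifier() or typ not in ALLOWED_TYPES:
            raise ValueError(f"rejected field {field!r}: {typ!r}")
    return schema

def emit_code(schema: dict) -> str:
    """the fully symbolic step: a deterministic template, no llm involved,
    so the output is correct by construction given a validated schema."""
    lines = [f"class {schema['name']}:"]
    args = ", ".join(f"{f}: {t}" for f, t in schema["fields"].items())
    lines.append(f"    def __init__(self, {args}):")
    for f in schema["fields"]:
        lines.append(f"        self.{f} = {f}")
    return "\n".join(lines)

code = emit_code(validate(llm_proposed_schema))
print(code)
```

the point of the split is that only the last step emits code, and that step is a deterministic template. a bad llm suggestion can only be rejected at validation, it never gets silently compiled in.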
>>106471567
>and we have better ais than llms for programming too, ones that actually work
name drop? id imagine they use some form of llm to help the process, cause why not
>it takes more time to do that to write the code yourself
go time yourself making a website to specifications vs asking an llm to shit it out. if youre talking about different languages then sure, its not going to be good at the more complex languages. that doesnt make your point correct though. if youre at a master tier level and can run circles around it, that doesnt lower its value for a junior, or someone who doesnt particularly care about the details and just wants the end result. it creates the end result.
>and very quickly you run out of context space
youre using the shitty companies (try google)
>>106471648
>namedrop
here: >>106471633
model based development
>go time yourself making a website to specifications vs asking an llm to shit it out.
this is trivial
if i were to shove my codebase into an llm it would give up at the headers stage
picrel is how it looks when i havent yet written all of what i would call "boilerplate"
and the headers are already 330 lines long
i routinely end up with headers bw 500 and 1000 lines of code
and said headers are just the "roadmap" for the rest of the program (in case you dont know c)
>>106471648
>doesnt particularly care about the details and wants the end result. it creates the end results.
this is called a boss btw.
the llm lets you become a boss, and it's your employee now.
>>106471648
>>106471685
and ofc i forgot picrel
>>106471531
Every causation is just a very strong correlation.
>>106471694
yeah, but causation is a subset of correlations and thats why llms are "dumb"
they cant think a priori
>>106471694
>>106471701
i mean
llms cant think a priori
therefore they dont know the difference bw correlation and causation bc they cant theorize
therefore theyre "dumb"
>>106471515Which model were you using?
>>106471729
was using chatgpt so probably GPT5 in this convo
>i cant pee chatgpt, help me im freaking out
>wtf go to the hospital bro
>well i also got slammed into concrete
>oh okay... changes things then. might heal f a m
>okay thanks f a m
>>106471776
youre using it wrong
you cannot take its advice
the correct way of using llms is:
>hey chud gpt. draw me up a list of ailments that correlate with the following symptoms
and then YOU make the decisions