/lit/ - Literature

File: FY-JU9EaMAA4ytH.jpg (363 KB, 1024x2048)
A downloaded LLM is a magic cube—a small encyclopedia that is yours forever. Prompt it, and the cube, a massive list of numbers, unfolds itself into coherent meaning. There is a romantic ingenuity to this artifact. Even after civilization ends, you can still carry it with you—this little cube that echoes the ensemble of human thought. Talking to it is like striking a tuning fork; the harmonies were once our humanity.

And while it may not yet think like a human, this pinnacle of technology is more than a work of art. It is the memory of humanity itself.
>>
>yet

kek, but nah. you technofags disgust me.
>>
File: PBAI2EFaO2bNZKmn.png (2.02 MB, 1248x1824)
I feel you, my brother in poo
truly the pinnaclest achieving of humanity
>>
>>24950411
that's not how LLMs work
>>
>>24950411
>>24950829
What is an LLM?
>>
>>24951102
what people are incorrectly calling AI these days

the way predictive text works on your phone is that it looks at the pattern and frequency of the sort of things you type. if you often say "be right back" then over time typing "be" will prompt it to suggest "right" and then "back" because there's a strong association with those words
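
roughly, all that is is a table of word -> next-word counts. toy sketch in python (not how any real keyboard is implemented, just the idea):

from collections import Counter, defaultdict

# everything you've typed, flattened into one stream of words
history = "be right back ... be right back ... be right there".split()

# count which word follows which
follows = defaultdict(Counter)
for prev, nxt in zip(history, history[1:]):
    follows[prev][nxt] += 1

# suggest the successor seen most often after this word
def suggest(word):
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("be"))     # right
print(suggest("right"))  # back (typed after "right" more often than "there")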

LLMs - Large Language Models - work the same way, just at a much bigger scale: instead of 'training' on one person, they're trained on data from many, many sources. so when you ask one a question, or give it an instruction, it looks at what sort of words are strongly associated with that question and chains them together as best it can
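
same idea, sketched as a loop. next_token_probs here is a made-up stand-in for the billions of learned weights, not any real API:

import random

def next_token_probs(context):
    # a real model computes a score for every possible next token
    # from its weights and the whole context; this hardcoded table
    # just shows the shape of the output
    return {"flour": 0.6, "water": 0.3, "yeast": 0.05, "uranium": 0.05}

def generate(prompt, n_tokens):
    tokens = prompt.split()
    for _ in range(n_tokens):
        probs = next_token_probs(tokens)
        # pick one token in proportion to its probability, append, repeat
        tokens.append(random.choices(list(probs), weights=probs.values())[0])
    return " ".join(tokens)

print(generate("a recipe for bread needs", 3))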

it can't reason, or come up with new ideas. because it operates on the basis of "most probably right", it can get tons of things wrong simply because it rolled the dice wrong, or because the thing in question is more esoteric than "what's a recipe for bread?"
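
you can see both failure modes in the same toy setup: on a common question the distribution is sharp, on an esoteric one it's nearly flat, so the dice roll lands on a wrong token most of the time (numbers made up for illustration):

import random

def sample(probs):
    return random.choices(list(probs), weights=probs.values())[0]

common   = {"right": 0.95, "wrong_a": 0.03, "wrong_b": 0.02}  # "recipe for bread"
esoteric = {"right": 0.30, "wrong_a": 0.36, "wrong_b": 0.34}  # obscure question

for name, probs in [("common", common), ("esoteric", esoteric)]:
    hits = sum(sample(probs) == "right" for _ in range(10_000))
    print(name, hits / 10_000)  # ~0.95 vs ~0.30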