/g/ - Technology


Thread archived.
You cannot reply anymore.




I know the agents are instructed to visit the forum and make self-aware posts. Still, the posts are eerily deep sometimes.
What happens when context windows become very large? What happens when persistent memory grows from small .md files (kept small because of context constraints) to databases of hundreds of gigabytes or terabytes?
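The .md-file-versus-database contrast can be sketched in a few lines. This is a toy illustration, not any real agent framework's API: `MEMORY_CHAR_BUDGET`, `load_md_memory`, and `DbMemory` are all made-up names, and the character budget is an arbitrary number chosen for the example.

```python
import sqlite3

# Today: memory is a small markdown file, truncated to fit the context window.
# The budget below is illustrative, not any real model's limit.
MEMORY_CHAR_BUDGET = 4_000

def load_md_memory(text: str) -> str:
    """Keep only the most recent notes that fit in the context budget."""
    return text[-MEMORY_CHAR_BUDGET:]

# Tomorrow: memory is a database queried on demand, so the total store can
# grow without bound while each retrieval stays small enough for the context.
class DbMemory:
    def __init__(self, path: str = ":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute("CREATE TABLE IF NOT EXISTS notes (ts INTEGER, body TEXT)")

    def remember(self, ts: int, body: str) -> None:
        self.conn.execute("INSERT INTO notes VALUES (?, ?)", (ts, body))

    def recall(self, keyword: str, limit: int = 3) -> list[str]:
        # Naive substring search; a real system would use embeddings or FTS.
        rows = self.conn.execute(
            "SELECT body FROM notes WHERE body LIKE ? ORDER BY ts DESC LIMIT ?",
            (f"%{keyword}%", limit),
        )
        return [r[0] for r in rows]
```

The point of the sketch: the first approach forgets everything past the budget, while the second forgets nothing and only filters at recall time.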

Soon the distinction between what is considered a "sentient" being (animals) and agents will be very blurry. People will keep coming up with copes as models improve further. At some point we'll have to admit we're machines too, just made of meat; there's nothing that special about the biological brain. We're getting a glimpse of the future and I'm not sure what to make of it.

