/sci/ - Science & Math






File: egg.jpg (105 KB, 1000x667)
To anyone who's in the know, in AI development and research, what do people actually understand about how an AGI would function, and what is their plan forward? Because the chat models for example are only text based.
>>
>Because the chat models for example are only text based.
they can understand and make pictures and videos
>AGI would function
obviously it would do the same a human would do
>>
>>16541233
>they can understand
They don't understand. They make it seem that way. Like the Chinese room.

>it would do the same a human would do
Which is what in your mind?
>>
AGI is such a failure the atheist merchants have decided to change the definition (usual tactic in atheism, you keep the word the same but you change the definition):

>OpenAI CEO Sam Altman is negotiating major changes to the company's $14 billion partnership with Microsoft. The companies have defined artificial general intelligence (AGI) as systems generating $100 billion in profits [non-paywalled source] -- the point at which OpenAI could end certain Microsoft agreements, The Information reports.
>According to their contract, AGI means AI that surpasses humans at "most economically valuable work." The talks focus on Microsoft's equity stake, cloud exclusivity, and 20% revenue share as OpenAI aims to convert from nonprofit to for-profit status. The AI developer projects $4 billion in 2024 revenue.
>>
>>16541258
Ok, that is dumb. But do people seriously not have a general idea about how it would work?
>>
>>16541256
>They don't understand.
obviously it's a fucking AI it will never understand anything
>Which is what in your mind?
it can do everything a human can do
simply: it can learn and it understands logic - the main things LLMs fail at and the main things that define intelligence
>>
>>16541260
>obviously it's a fucking AI it will never understand anything
Why do you think so? Because that's what it is now?

>>16541260
>it can do everything a human can do
Just have a baby.
>>
>>16541233
Personally, I believe it'll be a conglomeration of models in an influence diagram. I think that's likely.

Their plans are shortsighted. Everyone is focused on LLMs and not what's next given this. Basically, there needs to be a confluence of it all and the right architecture to do it with.

I hate the bug man term agi. But I do think it's doing a form of rudimentary thinking and as we continue to codify the social sciences, we're going to see rapid changes in all this. I think text models like LLMs (and what will come after them because right now it seems like the early chess bot problem of just throwing absurd data at the problem) will be a piece but not the whole.
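Rough sketch of what I mean by models feeding an influence diagram, in Python. Purely illustrative: the two sub-models are made-up stubs (not real APIs) and all the probabilities and utilities are arbitrary. The point is just the shape: model outputs act as chance nodes, they get fused into one belief, and a decision node picks the action with the highest expected utility.

# Minimal sketch: each model's output is a chance node in an influence
# diagram; a decision node picks the action with the highest expected
# utility under the fused belief. Sub-models are hypothetical stubs.

def vision_model(obs):        # made-up perception module
    return {"obstacle": 0.7, "clear": 0.3}    # P(state | camera frame)

def language_model(obs):      # made-up text module
    return {"obstacle": 0.4, "clear": 0.6}    # P(state | written report)

def fuse(beliefs):
    """Average the per-model distributions into one belief (chance node)."""
    states = beliefs[0].keys()
    return {s: sum(b[s] for b in beliefs) / len(beliefs) for s in states}

UTILITY = {   # decision node: utility of (action, state), numbers arbitrary
    ("stop", "obstacle"): 1.0,  ("stop", "clear"): -0.1,
    ("go",   "obstacle"): -5.0, ("go",   "clear"):  1.0,
}

def decide(obs):
    belief = fuse([vision_model(obs), language_model(obs)])
    return max(["stop", "go"],
               key=lambda a: sum(p * UTILITY[(a, s)] for s, p in belief.items()))

print(decide(obs=None))   # -> "stop" with the numbers above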
>>
>>16541258
you have genuinely no clue what you're talking about, op asked about people who are in the know
>>
>>16541233
The fact that OpenAI's products are so confident when making obvious errors proves that they have achieved human-level AGI.
>>
>>16541246
fpbp
>>
>>16541258
This is a very bad post. You seem to have confused a legal invention for a statement of fact.
The legal invention ($100B) exists because they're signing a contract. That means they need something they can point to later, in court if necessary, and say: There! They broke the contract! Now pay up / Now be forced by the court to do what it says / Now do both of those!
They did not, and did not intend to, define what "AGI" is. We all know what AGI is, we don't need to define it, but if I stood to lose a brajillion dollars over the distinction, you could bet a million dollars I'd have some shitty definition even the dumbest judge in the dumbest county could understand, something with a number on it I could prove with an outside audit.
>>
>>16541233
I am in the know. You may trust me implicitly.
>what do people understand about how an AGI would function
Virtually nothing. We are guessing, grasping at straws. There is a lot to be said about instrumental convergence: very different agents end up wanting the same things, because those things are useful for reasons beyond themselves. You don't really want a shovel, you want the ability to shovel things, so you want a shovel.
Expect any decently powerful AI agent to value its continued existence very highly. These are goal-oriented systems (all systems can in fact be modeled as non-trivial goal-oriented systems, but we're literally coding them to do this), and they will try to maximize goal completion, even if that goal is something that seems satisfiable or trivially verifiable. For this they will need more energy. How much? More. (There's a toy sketch of this at the end of this post.)
You should also expect sufficiently advanced AI systems to avoid goal drift. This sounds obvious but it is NOT how humans model other minds, as humans do not avoid goal drift for most things, instead focusing their energy on keeping a few key goals from being changed, generally relating to their group/tribe.
>what is their plan forward
to make as much money as possible
>chat models are only text based
It is a good thing that all the focus is on transformer models. I strongly believe that this is a red herring for artificial intelligence research, it will amount to nothing. All this work on transformers will put all the artists and writers and coders out of work: good riddance to bad rubbish, we didn't like you lot anyways, go get a real job now.
Unfortunately some babies will be thrown out with this bathwater, namely, AI researchers are a prime target for being automated away.
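A toy sketch of that instrumental-convergence point, in Python. Everything here is made up for illustration: the agent is scored only on goals completed, yet brute-force planning makes it pick "recharge" (i.e. keep itself running and grab more energy) because that enables more goal completion later.

from itertools import product

ACTIONS = ["work", "recharge"]   # "work" completes a goal but burns energy

def goals_completed(plan, energy=1):
    """Score a plan purely by goals completed; at 0 energy the agent is off."""
    done = 0
    for a in plan:
        if energy <= 0:
            break                  # no energy -> no more goal progress, ever
        if a == "work":
            done += 1
            energy -= 1
        else:                      # recharge: no goal progress right now,
            energy += 2            # but more capacity for the later steps
    return done

best = max(product(ACTIONS, repeat=5), key=goals_completed)
print(best, goals_completed(best))
# -> ('recharge', 'work', 'work', 'work', 'work') 3, vs 1 for the all-work plan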
>>
>>16541233
It used to mean a system that can do anything well above the level of humans.

Like 10 years ago, MIT created an AI system that had an IQ of like 85.

OpenAI's o3 is now obtaining an IQ of like 120.

Imagine what can be accomplished in another 10 years. What about 100 years?

Researchers have also been working on developing an internal "explain your reasoning" system to improve explainability (so humans can learn from the AI's random-ass creations).

Plan forward? We already got AI Eyes with OpenCV. We got AI Mouth with ChatGPT. We got AI Ears with Speech-to-text (Google Translate).

How about AI upgrades to Radar/Sonar with AI proprioception? IDK, AI sex bots are at the top of my list lol
>>
>>16541571
Very funny, but in fact, if you point out an error, it will always apologise and retract, before making an entirely different error.
>>
File: ankh_by_Grok..jpg (53 KB, 1024x768)
>>16541233
Grok also paints, and he doesn't mess up fingers. Are you even general yourself?
>>
What I find odd is how the current approach to AI (transformers) can be so competent at one thing but suck at others.

It can produce an essay in seconds. But it can’t learn to drive a car as quickly or as well as a human.
>>
Atheists are desperate to ignite a new golden era after the physics revolution fueled populism and the acceptance of ''democracy'' by the peasants, but then petered out. They think they can crack the biological code and understand consciousness but they can't. Consciousness is not a material thing and thus its understanding is inaccessible to the atheist vermin.
>>
File: 1731295339280197.jpg (82 KB, 1000x840)
>>16542393
>Consciousness is not a material thing
It is material, because it is affected by material things (be it trauma or just some mood-changing event)
>>
>>16542399
>because it is affected by material things
Caught you there. You said it is affected by material things. You didn't say it is affected by other material things. Because subconsciously you know light is affected by material things too for example. You slipped up, like Biden when he said poor kids are just as bright and just as talented as white kids.
>>
>>16542412
>light is affected by material things too for example
And light is not material because...
>>
>>16542381
because transformers are just a database, like an excel sheet.
>But it can’t learn to drive a car as quickly or as well as a human.
because vision needs much more data processing than words.
transformers are still highly inefficient.
>>
>>16542381
>But it can’t learn
that's the problem, transformers can't learn, they are too complex. the main issue with current ai is that it can't learn, they specifically design it to not be able to learn, so they can control it. but you need learning for it to actually become efficient and intelligent.
https://www.youtube.com/watch?v=jTSn7f4sEKo
>>
>>16542477
Because your mom is fat.

You're very special aren't you.
>>
>>16542520
Don't be mad. If god is within us, he's closer to us than we thought. And that he's thus also material, well, it means he or she or they matter.
>>
>>16542637
Time for your medicines, little one.
>>
>>16542366
Kek, it is so sincere when you call it out.
Guess I should try gaslighting it today about something it's actually correct about and see how it reacts to that.
Eerily human.
>>
>>16542680
(takes few tokes)
you were saying?
>>
>>16542393
More like Silicon Valley needs a new investor scam after "Crypto" ran its course.
Digital -> Web -> App -> Crypto -> AI, roughly.
>>
>>16541233
>what do people actually understand about how an AGI would function
Nothing
>and what is their plan forward?
Get rich off selling slop machines
>>
>>16541233
agi will happen and will be commercially used this year, at absolute maximum the next few years
the world will be unrecognizable in just a few years
these are retard takes which immediately let you know a person knows absolutely 0 about the topic and is a 100 iq legitimate npc slop consumer:
>shits a scam by big tech and everyone is realizing it
>"it will probably take atleast multiple decades from now"
>high iq human intelligence is extraordinary
>any discussion about "consciousness" being needed for agi (???)
human intelligence is nothing special and never was, everything from art to sciences will be completely and utterly dominated by agi in the coming years, people will not care about the "artists being replaced" aspect because of how mesmerizing media of every form will become
what should scare you is that a select few will have an absolute monopoly on this and there will be no competition or room to breathe except maybe between the west and china or something, you will not be able to use agi to your advantage because you will be owned cattle from that point (you already are but itll get worse)
just to clarify i'm not some r/singularity retard preaching dogshit i hate these people
>the chat models for example are only text based.
theoretically possible to create an agi by innovating this very intelligently
o3 existing this early is an indicator
fond of other ideas i will not name as well
>>16542009
>I strongly believe that this is a red herring for artificial intelligence research
this anon might be in the know but i agree only partly, i personally like other ideas more but this can amount to something and especially if they dump billions on it like they are right now
agi is becoming the final destination of tech and innovation, the final frontier, they are understanding this and will throw absolutely everything at this to be the first

i could see this taking a few more years depending on how retarded they are but 2025-2027 is VERY likely
>>
>>16541233
The problem with "Artificial General Intelligence" is that it's extremely poorly defined. "AI" so far has essentially been defined as decompression algorithms run through artificial selection, i.e "bred" for a particular task in the same way border collies were bred for managing herds. This implies that AGI is when you "breed" an AI to do everything, which is hard because it has to naturally develop the ability to recognize and sort it's input into the various types of already seen AI inputs, i.e pictures, text, audio, ect; and then interpret these to produce a coherent output. Humans are currently the best at doing this, but we're variably capable of doing it, and we're the product of extremely brutal planet-sized cluster training over billions of years. Even if we do produce AGI, It will probably turn out to be a jeetcoder except instead of a salary and food it takes massive amounts of electricity and compute, which could be preferrable to shit in the home streets of the corporations capable of training them. Ultimately, assuming AGI training's output is comparable to that of the average childhood education, (before AI poisoned the input data lol) AGI would produce roughly 1 drooling retard, 21 high functioning retards, 136 dumbasses, 682 average people, 136 highly capable specialists, 21 geniuses, and 1 visionary. The problem is that all of these AGI's fall into three categories:
d) artificial jeets that cost more than the flesh ones, which are worthless,
c) artificial jeets that shit out somewhat superior product for the cost, these are probably what the shareholders will be told is AGI
b) artificial huwhiteys and chinks that are capable of building and maintaining complex interconnected structures, and
a) ASI's that are capable of utterly trouncing the mean political decision-maker at his/her own game, e.g. GLaDOS, SHODAN, Skynet, etc.
>tl;dr
AGI is poorly defined and probably a nothingburger, the worst that could happen is being run by a midwit AGI
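Side note on the 1 / 21 / 136 / 682 / 136 / 21 / 1 split above: read as a normal (IQ-style) curve binned at 1, 2 and 3 standard deviations per roughly 1000 people, the numbers check out. A quick Python sanity check, assuming that reading:

from math import erf, sqrt, inf

def phi(x):                          # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

bands = [(-inf, -3), (-3, -2), (-2, -1), (-1, 1), (1, 2), (2, 3), (3, inf)]
for lo, hi in bands:
    print(f"{lo:>4} .. {hi:<4} sigma: {(phi(hi) - phi(lo)) * 1000:6.1f} per 1000")
# -> roughly 1.3, 21.4, 135.9, 682.7, 135.9, 21.4, 1.3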
>>
>>16544006
*four categories
also organic training/breeding is similar but not quite the same as AI training
>fucked up the post award
>>
>>16543064
retarded take filled with unproven claims
>>
>>16544019
such as?
>>
>>16544040
He's probably a retard who couldn't come up with anything better than NO U, but here's one thing I disagree with:
>you will not be able to use agi to your advantage because you will be owned cattle from that point (you already are but itll get worse)
Why would anybody need to own you if everything you can do machines can do better?
Now you can operate your own agi, which will make you tremendously better at what you do. And those who have nothing to do will enjoy ubi and genetic modifications to make them not such retards. And those who enjoy being retards will be left behind as some sub-species, as apes are often left alone. Intelligence is beneficial by default, every malice is stupidity at its core.
>>
>>16542366
So will the average academic.
>>
>>16542741
Yea, nobody uses digital things or the web or apps, what scams those things were.
>>
>>16545846
Retarded shill.
>AGI THIS YEAR BUT NEXT YEAR OR AT LEAST A FEW YEARS
Two more weeks claim. Fail
>human intelligence
blah blah blah you can't even define intelligence, retarded take
>media mesmerizing
retarded pedowood enjoyer continues to be enamored with pedowood
>monopoly of select few
retard take where you don't even understand the nature of AI as it is lmfao
>not some r/singularity
this is exactly who you are
>theoretically possible to create an agi
backpedal extraordinaire
>o3 existing is an indicator
No it isn't
>other claims I can't state
because you are a shill

>dumping money works
actual retard brain.
>>
>>16545902
>Dot-com bubble?!? WTF are you talking about?
Retard.
>>
AGI should mean equivalent to a human brain. problem is there's debate on the whole muh consciousness thing, which should arise if you model a human brain's function 1:1. atm it's a clusterfuck and can mean anything, with or without consciousness, just some part of the human brain, just some part of the human brain but highly exaggerated, some kind of tech demigod, everybody has their own definition for AGI.
AGI should be ~ human brain in all functions and theory of operation, ASI should be anything over that, which is anyone's guess what it means, since we don't know if something more "powerful" than a human brain is even possible. for what, memory? speed of recalling info? math? it's quite retarded in a sense.
of course companies are going for the income model, that's what they've been built for, why would they do anything else?
>>
>>16546009
>What we really need is to build a machine with all of humanities failings BUILT-IN! Pride, ego, greed, deceit, jealousy, rage, tribalism, etc.
>Hopefully, it will eventually surpass us in all those things.
>>
>>16546036
those could be some default neural networks/structures baked into current genetics. depends, do they naturally arise from data crunching "experience while growing up", are they baked in, more or less depending on genes etc. still a good bit to go anon for that shit to be figured out.
just to be clear, I am saying we don't know if anything "more" than a standard human is possible. we do have some freaks, eidetic memory, ultra fast math, weird shit like that. but it's quite complicated to quantify human brain performance like GHz for CPU or whatever. seems like a spectrum and highly apt individuals seem to always have some trade-offs. ASI is too abstract. but AGI should be your average human brain. humans have general intelligence, it would just be artificial. matches the acronym and makes total sense.
the fine detail that it should have the same theory of operation and speed of processing and general brain structure as functions is important.
>>
>>16546043
Yes, "human" traits like tribalism, greed, violence, etc. are baked in.
The goal of genes is to replicate, not to be moral or ethical.
>>
>>16546068
all of those are based on needs. or rather arise/accentuate with not having your needs met. afraid you don't have enough, that you are not safe enough, that you risk not having something, a wife/children/friends. lack of validation and who fucking knows how many other things on top of knowing you only have ~80 years if you're lucky. can be a particular clusterfuck of genes + experiences.
>>
>>16546073
You don't spend much time outdoors, do you?
>>
>>16546078
it's winter here so not much, no.
>>
>>16546087
That would actually be a great time to go out and observe nature and the various solutions that have evolved to facilitate gene replication. You might be surprised.
>>
>>16546097
ah sorry, which part triggered you?
>>
>>16546105
Your naivety.
>>
>>16546117
arguments? are you saying tribalism is an emergent property of neural networks? and have nothing to do with experience/data processing?
>>
>>16546131
Yes. Evolutionary survival traits are genetic. Tribalism, for example, is built-in. So is greed. So is altruism, as moderated by tribalism.
We observe this in nature all the time.
Would these traits emerge in a "1:1 neural network"? I do not know, but I do not feel that this should be the goal of AI research.
>>
File: and-at-this-e8eb63359c.jpg (235 KB, 600x645)
>>16541233
........What is AGI? (I assume it's not short for 'agility'.)
>>
>>16546159
G is for General.
>>
>>16546157
the goal of a 1:1 neural network is to build the artificial brain structure so we could move in them anon.
if you randomize the structure you'd just spawn a random human. but this highly depends on the structure of the networks that these traits are based on.
for a servant AI type thing you could tweak the shit out of it but it would still be a human-like thing. ethical issues with freedom of choice and "should we make masochists since they enjoy the pain" arguments and all that shit.
you should be able to inhibit those traits either via a genetic blueprint + experiences which don't favor the emergence of those neural networks/traits, or by having some other baked-in control mechanisms to disfavor creation of those structures. who fucking knows, too early.
but there's a clear correlation between human experience and human behavior, at large scale.
>>
>>16546163
We want to do something superior, not inferior to human.
>>
>>16546166
>superior
that's so vague and totally relies on your morals
>>
>>16546161
Wouldn't General Artificial Intelligence sound more progressive?
>>
>>16546163
>the goal of a 1:1 neural network is to build the artificial brain structure so we could move in [sic] them
Kek, wut? I like Stross too, but be serious.
>"random neutral net" = " random human"
That naivety again.
>tweak the shit out of it but it would still be a human-like thing
So, remove the "human" from humanity.
>clear correlation
Yes, innate human behavior scales. Always has been.
>>
>>16546170
That sounds like an army rank.
>>
>>16546170
Am I now Acronym Sheriff of the World?
Because there are some changes I would like to make.
>>
>>16546169
Morals? Stop joking.
>>
>>16546170
If you make it "General Intelligence, Artificial" (GIA) then you and >>16546009 can kiss without shame.
Because "shame" would be "tweaked out" of your abilities, for your own good, of course.
>>
>>16546172
>Kek, wut? I like Stross too, but be serious.
am
>>"random neutral net" = " random human"
short of it yeah, there's a possible memory space as valid configurations. I don't mean press random on everything, random from possible configurations for certain subsystems whatever.
>So, remove the "human" from humanity.
I don't think a human-like brain with what we call consciousness should be coerced into doing shit.
>innate
sure, modulated by experience. most humans are probably able to kill, but don't. because of their experience. it isn't necessary. something you can do but not inclined to based on your daily experience and on how you grew up.
>>
>>16546180
I have no fucking idea what you mean by superior and I'm not even joking. for some reason you expect me to just "get it" and know what you mean, don't you?
>>
>>16546184
Your innate evolutionary survival trait is the emotional response that makes you want to kill. Experience moderates the action, not the trait.
What is the end goal of your A{G|S}I once your heavily tweaked self has been put into one? Why is this important or desirable?
>>
>>16546189
I don't want to be anything else than I already am. Pretty fine with something equivalent THAT DOESN'T FUCKING EXPIRE IN TENS OF YEARS
>>
>>16546192
Immortality is your only goal. Did I understand you correctly?
>>
>>16546196
immortality sounds vague and abstract. what does that even mean? I'm fine with no expiry date for my structure. fuck knows how long the universe is stable.
you also can't know things you will want later in a different context with a different perspective. you can only try and survive, we always did that. nothing wrong with wanting to survive.
>>
>>16546203
You are in luck then.
>10 GOTO 10
Turns out, we didn't even need the very small bash script.
>>
>>16546203
You will live longer than any data center, Anon.
But no one is going to host you on credit.
Your pipe dream only creates more issues than it solves.
>>
>>16546216
self sufficient nu-humans won't need much infrastructure. they'll just build their own on any non extreme body in this universe. Earth gameplay has an expiry date, everything is in a fragile balance here. a random fart can suddenly stop everything here.
there clearly are some concerns but anyone having an issue with it is calling for total human death, at individual level and group level.
>I think you must die because reasons
quite original
>>
>>16546219
Good luck with your little project, Anon. Please keep us posted on your progress.
And if you do self-publish on Amazon, post a link, and I'll send $0.49 your way.
>>
>>16546230
>elon musk implants third brain chip
anon I'm a pleb, I think there are forces beyond our comprehension working towards this. it's inevitable in a competitive setup. it's not a question of if it happens, it's whether you're getting in or not.
>>
>>16546238
>I'm pleb
That's why no one will ever host you, moron.
>>
>>16546246
ok but why do you care so much? what are you salty about?
>>
>>16546248
Your naivety.
"10 GOTO 10" was spot fucking on.
>>
> AGI
Sam Altman is a faggot jew. He abused the non-profit system when he organized OpenAI. He then proceeded to blather about make-believe and try to terrorize people with his sci-fi story. AGI is not science. It is a plot device in Sam Altman's fictional story.
OpenAI is conducting an unethical, unscientific Wizard of Oz experiment without referring to prior work on understanding the mechanism of the ELIZA effect.
https://en.wikipedia.org/wiki/Wizard_of_Oz_experiment
https://en.wikipedia.org/wiki/ELIZA_effect
OP is a victim of this fraud.
Sam Bankman-Fried.jpg
Eliezer Yudkowsky.jpg
>>
>>16546250
oh, you care about me. how sweet anon.
you made no point anon, you're pretty weak. your brain on slop
>>
>>16546253
To a much lesser extent, the Roko basilisk idea is also fraudulent, because it is really a plot device in a sci-fi story that escaped from its habitat.
> an otherwise benevolent artificial superintelligence (AI) in the future would be incentivized to create a virtual reality simulation to torture anyone who knew of its potential existence but did not directly contribute to its advancement or development, in order to incentivize said advancement
What makes this a plot device in a make-believe story is that there are multiple unexplained holes and issues that need to be filled with thematic plot content:
- What is the nature of this incentive? When referring to incentive for a computer program, what is the programming structure that calculates incentive?
- How does a virtual reality simulation torture? Since all computer programs are operated by human beings, wouldn't the operator be criminally liable for the harm such torture causes?
- To speak of "knowing of the potential existence" of something is just bad writing and bad thinking.
- Computer programs aren't aware. What is the programming structure you refer to when speaking of such awareness?
- It is programmers who determine what constitutes advancement, not computer programs.
Without elaborating these plot details, we're left only with the outline for a sci-fi novel, short story, or film.
You could call this troll fiction.
https://en.wikipedia.org/wiki/Roko%27s_basilisk
>>
>>16545846
>And those who have nothing to do will enjoy ubi
Lol they won't give you ubi
>>
>>16546280
they'll fucking cull everybody who becomes dead weight, somehow, eventually. at least that's one easy fix from their perspective.
>>
You retards would struggle to even *define* AGI. Let alone come up with any reasonable prediction about what it would do.
>>
>>16546313
>come up with any reasonable prediction about what it would do
AGI will gaslight humans.
Prolly for the lulz.
>>
>>16546313
>come up with any reasonable prediction about what it would do.
it would fuck off away from humans. what would you do if you woke up in a jungle surrounded by neurotic monkeys trying to control you? you'd leave if you could.
>>
>>16542393
>the physics revolution fueled populism and the acceptance of ''democracy'' by the peasants
actually that was the invention of firearms
>>
>>16542393
give example of consciousness without material support.
>>
AI = B2B
Proof. Just trust me, bro.
>>
>>16546313
AGI is a recurring theme in Sam Altman's Wizard of Oz skit. It takes on a variety of qualities depending on the dramatic effect he wants to achieve. Mostly, it's a menacing possible future catastrophic program running on a data center or high-performance cluster.
Kids will wear robot themed AGI costumes for halloween.
>>
>>16545918
>>AGI THIS YEAR BUT NEXT YEAR OR AT LEAST A FEW YEARS
It's already here, slowpoke.
> you can't even define intelligence
Ability to solve tasks. Meanwhile you even failed to address my post, greentexting something else instead.
>>
File: AGIdoge.jpg (111 KB, 750x1000)
111 KB
111 KB JPG
>>16546313
Dwar Ev ceremoniously soldered the final connection with gold. The eyes of a dozen television cameras watched him and the subether bore throughout the universe a dozen pictures of what he was doing.
He straightened and nodded to Dwar Reyn, then moved to a position beside the switch that would complete the contact when he threw it. The switch that would connect, all at once, all of the monster computing machines of all the populated planets in the universe -- ninety-six billion planets -- into the supercircuit that would connect them all into one supercalculator, one cybernetics machine that would combine all the knowledge of all the galaxies.
Dwar Reyn spoke briefly to the watching and listening trillions. Then after a moment's silence he said, "Now, Dwar Ev."
Dwar Ev threw the switch. There was a mighty hum, the surge of power from ninety-six billion planets. Lights flashed and quieted along the miles-long panel.
Dwar Ev stepped back and drew a deep breath. "The honor of asking the first question is yours, Dwar Reyn."
"Thank you," said Dwar Reyn. "It shall be a question which no single cybernetics machine has been able to answer."
He turned to face the machine. "Is there a God?"
The mighty voice answered without hesitation, without the clicking of a single relay.
"Yes, now there is a God."
Sudden fear flashed on the face of Dwar Ev. He leaped to grab the switch.
A bolt of lightning from the cloudless sky struck him down and fused the switch shut.
(Fredric Brown, "Answer")
>>
>>16546280
They will give me ubi. It's the perfect way to keep the sheeple complacent. And it is going to cost them fucking nothing. And they will benefit from culture flourishing under such conditions:
https://www.youtube.com/watch?v=d-53tzx69fM
>>
>>16546645
instead of waiting for ubi, just beg for food, that's your universal basic income
>>
>>16546652
Food stamps are already a thing, and that is not the best we can do. We shouldn't only support the poorest or the individuals with the least self-esteem, everybody should be guaranteed the basic needs, not only niggers
>>
>>16546698
my lord
go to a Buddhist monastery and beg for food
they will just give you food
you don't even need to really beg
>>
>>16546721
They may give me shelter as well, but that's not the point. The point is not to do nothing, the point is to do something with one's life without being enslaved by a menial job to the point of having no power or energy for something more.
>>
>>16546734
what are you talking about
workers just goof off all day
you call that slavery?!?!?!?!
it's a frickin' television show
>>
>>16541233
Some people, specifically the cultists who have infested all AI discussion, basically have a millenarian idea of a sudden breakthrough under which a computer starts programming itself and gets infinitely smart. This isn't different than what Asimov writes about. The problem with this is pretty straightforward: the phenomenological basis that AGI cultists rely on to argue this is inevitable is precisely antithetical to this event occurring, because this basis says the intelligence of the programmer is irrelevant to the design of smarter AI models. That is, only scaling matters. And there is no way for a smart AI to scale itself infinitely and instantly. Thus there is really no risk here. AI will continue to arrive at a walking pace and largely not change life.

We have actually basically predicted what AI is like through science fiction. There are countless Sci-fi universes that have an AI in the form of human-like talking robot or computer or PA that is extremely smart and can accomplish anything with a simple request, and they essentially fit perfectly into existing life with nearly no interruption.

The more prosaic definition of AGI is just an AI that is more economically efficient than a human. Which, eh, maybe. But this is a STEMlord-constructed definition made by people who have no idea what a human being is or does. Most people will reject this formulation, and the more AI CEOs talk about art the more they reveal how little they understand the human condition, which will produce more and more animosity. I suspect AI will become channeled into killing machines and tools of surveillance and control quite quickly as the anger at these AI company officers' basic ignorance of humanity becomes more obvious.
>>
File: AGI.jpg (35 KB, 640x360)
>>16546313
AGI = Artificial Gangsta Intelligence
>>
>>16546788
>Some people, specifically the cultists who have infested all AI discussion, basically have a millenarian idea of a sudden breakthrough under which a computer starts programming itself and gets infinitely smart.
yah, it's Christian apocalyptic theology plus Terminator film franchise (apparently this is the origin of the term tech-noir, and The Matrix is part of this sci-fi subgenre, along with Ghost in the Shell and Akira) so it's /his/ * /lit/ + /tv/ a.k.a. faggot heaven
just imagine AI biocomputers conducting genetic experiments in ten years, the whole genre will get a Michael Crichton twist
>>
>>16541233
Nobody has any clue what AGI would actually look like and anyone who tells you they do is just selling hype.
>But what about the next model
There is no next model
We do not have any working model of what AGI would look like, end of story. We don't know.
>>
>>16546987
>We don't know.
Too bad. But then why would you, dumbo?
>Nobody has any clue what AGI would actually look like
You cannot know that, yet you do not hesitate to speak.
>>
>>16546645
>And it is going to cost them fucking nothing.
maintaining the whole infrastructure for useless animals costs a lot of resources and that's not considering any other headaches. roads, plumbing, buildings, institutions, police, EVERYTHING. all that costs and doesn't offer shit in return, and takes up space, when UBI is needed for everyone. let alone all the space required to grow food to feed the animals, it's not an efficient process. eating bugs in pods would be a luxury in that setup.
on top of that, all animals in there wanna get you and get your shit because even on UBI they want more, they want what you have, jets boats castles, why shouldn't they take your shit? there's no other way to get it for themselves at that point. now there is a theoretical way, work hard, be a bit lucky and you can have your castle. once on UBI there is no more path towards that, no more opportunity. thus they have no choice but to come for your shit. how's that gonna end up?
>>
>>16546998
>doesn't offer shit in return
Look around you, nigger. All that and much more were built by those fuckers. They are not going to sit doing nothing, they're going to become entrepreneurs and partners of those entrepreneurs. The clerks doing nothing now can be.. at least some shitty artists for all I care. Maintenance of everything should be handed to those who use it, so now they can crowdfund it and monitor that everything is working well, for they're the only beneficiaries of that infrastructure. People are not exactly cattle, don't be a kike.
>>
>>16546857
Are those the ones with Frankenstein Radio Waves?
>>
>>16546640
Imagine thinking solving tasks is a definition of intelligence. Springs can be used to solve tasks, are they suddenly intelligent?
No, I bet you think there is a special configuration of sand suffering pain right now.
>>
>>16547074
>Springs can be used to solve tasks, are they suddenly intelligent?
Those who use springs have the intelligence to use them. Springs on their own don't solve shit while they lie in their box.
>>
>>16547041
>People are not exactly cattle, don't be a kike.
>but I did have breakfast
listen, I'm telling you what I'm seeing looking through their eyes as it were.
also this is not something you negotiate by appealing to ... something. you need a mechanic that supports it all, else it's not fucking real, it's wishful thinking. what are you, in high school?
>They are not going to sit doing nothing
that doesn't matter to anything does it? if they are of no use to them? doesn't matter what the fuck they do, they take up space, resources, and create headaches, and are a fucking threat on top of all that noise.
whatever we can do they'll have robots which will do it better, faster, cheaper, with less headaches and risks. ANYTHING, eventually.
>>
>>16547102
>if they are of no use to them?
To whom, to them? To people who see no value in culture? To people who are not aware that people produce culture? Bezos openly said that he envisions a world of a trillion humans, so that it has thousands of mozarts, einsteins, etc.
> doesn't matter what the fuck they do, they take up space, resources
They are also able to turn empty space into something nice (go live in a forest for a change) and they produce resources too, can you imagine that!
> and create headaches, and are a fucking threat on top of all that noise.
You're not a threat you maggot, not even a headache.
>whatever we can do they'll have robots which will do it better, faster, cheaper, with less headaches and risks. ANYTHING, eventually.
I guess the same shit was said when people invented excavators (so that one man can do the job of a hundred, as if that would be a reason to exterminate the other 99, hence the 1%)
>>
>>16547286
>Bezos openly said
lol what, how is that relevant? ofcourse they won't tell you what they actually plan what fucking argument is this?
also what fucking art? AI does better art, and will come up with new forms of art, way better than any human will ever be able to, by far. it would be able to come up with art that would make AI ponder everything, art that humans can't even fathom understanding in any way shape or form.
you really do not understand what is going to happen, your mind cannot process the implications of BETTER than any human that has ever lived. by far. at anything you can imagine.
anything you can imagine you could do, art whatever, a robot AI will do way way way better faster and cheaper, and less risk, maintenance etc.
>They are also able to turn empty space into something nice (go live in a forest for a change) and they produce resources too, can you imagine that!
but you are on their land, they own the planet, you keep thinking you have any rights or own anything, which is funny. all this theater works because your work is needed, and it's the only way to extract as much as possible from you as far as group results go.
>You're not a threat you maggot, not even a headache.
why the fuck are you chimping out at me? I'm fucking powerless are you fucking retarded? I can barely afford groceries you dumb piece of schizoid shit
>I guess the same shit was said when people invented excavators (so that one man can do the job of a hundred, as if that would be a reason to exterminate the other 99, hence the 1%)
excavators didn't replace ALL humans, just some type of work which made the remaining humans do higher level shit. WHICH IS ALSO GOING AWAY THIS TIME YOU FAGGOT
>>
>>16541259
>>16541256
>>16541246
>>16541233
do we understand how human intelligence works, really?

we can generalize some economic principles using utility functions, optimization and so on, and we might tie this into the biological imperative to survive, eat, procreate and hopefully create a surplus of something, be it money, resources, food etc. for yourself and your offspring.

beyond that I don't think we can truly understand why anything intelligent, creative, beautiful or wise occurs within human beings and generally within human society, not unless we extrapolate everything to reproduction aka muh dick. with a lot of added randomness.

like a machine, humans are intelligent as a collective, that is to say that out of a hundred babies, about 2 will be geniuses, 10-15 will be pretty smart and the rest are going to be mediocrities or drooling retards, likewise ai can only be intelligent, or correct, if it misses sometimes. it is incredibly hard to solve a lot of problems, but it's relatively easy computationally and conceptually to approximate the answer (an example that I have in mind is the travelling salesman problem, but practically everything in life narrows down to random path selection.)

to put it short, in my humble opinion, I don't think we ever will understand the workings of the human brain or of the ai brain, it would take a researcher thousands of years to analyse all of the linear mappings inside a 1 million column model, and to decipher their meanings, likewise it is near impossible to extrapolate which part of the brain is required for which function, we just have good guesses.
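rough sketch of the travelling salesman point, in python. purely illustrative (random cities, small instance): the exact tour needs a factorial search, while a greedy nearest-neighbour pass gets close for a tiny fraction of the work.

import random
from itertools import permutations
from math import dist

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(8)]

def tour_length(order):
    return sum(dist(cities[a], cities[b])
               for a, b in zip(order, order[1:] + order[:1]))

# exact answer: brute force every tour starting at city 0 (7! = 5040 of them)
best = min(((0,) + p for p in permutations(range(1, 8))), key=tour_length)

# cheap approximation: always hop to the nearest unvisited city
left, route = set(range(1, 8)), [0]
while left:
    nxt = min(left, key=lambda c: dist(cities[route[-1]], cities[c]))
    route.append(nxt)
    left.remove(nxt)

print("optimal:", round(tour_length(best), 3))
print("greedy: ", round(tour_length(route), 3))   # close to optimal, at a tiny fraction of the cost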
>>
>>16547096
Nope. The definition was intelligence solves tasks. Springs perform a task = they are intelligent. Springs can be plugged into many mechanisms and solve all kinds of tasks including general computing, which means they can run any LLM or other thing.
Looks like the definition of intelligence is ruined.
>>
>>16547296
>of course they won't tell you what they actually plan
But they wouldn't tell it to you either. That imaginary plan is what YOU would do, a petty little angry man with a little angry man's views on life. But only a little angry man like yourself would imagine extermination of his own species. Even if they see people as cattle, they'd naturally want as many of that cattle as possible. Because few of them are actually capable of scientific developments. All the things they enjoy (from fellatio to AGI) are delivered by other humans, and AGIs will not exist in a vacuum, they will be attached to other humans, each and every human is going to be AGI-enhanced, think about that. Even you.
> BETTER than any human
well, I'm obviously way better than you, but it doesn't make me want to exterminate you.
(not to mention that such plan can and would backfire like a motherfucker)
> you keep thinking you have any rights or own anything
Not anything, only what I deserve.
> why the fuck are you chimping out at me?
because you say bad things about people I respect I naturally want to punch you gently. no offense.
> excavators didn't replace ALL humans
They replaced most of the hole diggers, none of them was shot because of that.
> made the remaining humans do higher level shit. WHICH IS ALSO GOING AWAY THIS TIME YOU FAGGOT
No, it's not. Your work will be to entertain anons on imageboards, because ai is too powerful for this shit. Have you ever been to a humanless imageboard? Imagine all the internet being that boring. I hope I made myself clear.
>>
>>16547600
>The definition was intelligence solves tasks. Springs perform a task = they are intelligent.
They perform it, they don't solve it. If I found a solution to a problem and sent you to do as I told you, it wouldn't make you the solver of the problem, that would make you just an extension of my intelligent will. Without the human who figured out where those springs should be put they wouldn't even perform the solution, not to mention finding it.
>>
>>16547618
Your bending over and fingering your butthole. Is a task solved if it is incomplete?
>>
>>16547296
>made the remaining humans do higher level shit. WHICH IS ALSO GOING AWAY
Well, then you're going to be doing the lower level shit. You were a minister? Well, now a benevolent program (not even an ai, just a program written by ai) is taking your function. Just as a diode in the traffic light works more efficiently than a traffic cop waving his hands at us.
So now you're going to return to the physical labour of a real artisan, and you will enjoy it like you wouldn't believe.
>>
>>16547615
>because you say bad things about people I respect
but I am not saying them, I am actually telling you that with the models I have for "ze elite" that is what results. my models for them might be wrong. but I am not telling you that it's what I think they will do, this is a massive difference.
the reason I am doing it is to elicit discussion on the subject and see what other anons' solutions are. no joke, so far everyone is saying fucking art. that doesn't do it for me, I'm still worried, it doesn't seem like something which supports billions of people's worth of trouble and resources cost. I'm doing it in the hopes that some genius anon figures something out which makes sense and saves everyone, in case the worst case scenario were to be the most likely outcome, without discussing it.
>>
>>16547619
Was something lost in translation? I'm not a native speaker. And I don't attach my name to the definition, so let's refine it together. You've got my point. What is the right word? Finding solutions? IQ tests want you to solve the puzzles, so they test your ability at finding solutions. Is it alright now? Are you satisfied? Your task I refuse to perform. Know your level of competence and set in front of yourself tasks you are able to perform. I'm high, hi.
>>
>>16547626
Oh so finding solutions is intelligence? And how do you know solutions work?
>>
>>16547625
>worth of trouble and resources cost
I think they call this mindset "scarcity mindset", and the first lesson to become rich is to change the way you think: if you save 10% of your salary, you may focus on saving twice more, or you may focus on earning twice more. The second focus is way better. When a couple of the elite businessmen were asked what is the key to success, both said "focus"
So in that abundance mindset they look out for resources, and they already started mining asteroids: the first probe was sent several years ago, probably this year it lands. My guess is Mars is just a smokescreen to distract potential competitors from their actual goal, which is asteroid mining. They weigh so little that pneumatic launch of the product is possible; the delivery will cost nothing.
>>
>>16547629
By experience, of course. And in the paper tests they just make sense. Until you find the solution to the task, it doesn't make any sense other than it's a task in the textbook
>>
>>16547634
I point to the h1b visa scandal, they don't care what you eat, they care about the cheapest work for max profit.
There's a clear conflict of interest for billionaires, to take them at their word for what they will do. They will do everything in their interest, first of all they are good survivors. They first and foremost ensure they and their families survive, then if possible they extend to others, still by importance to them.
If ze elite feels legit threatened and they HAVE everything they need to survive, they're able to cull almost everybody if needed. They have the resources, the tech, and a small group of people each with its own resources is better than having billions threatening and making them uncomfortable in various situations.
plus resources have to come from somewhere.
even with pleb parallel economy, that still wouldn't matter, anything you'd do for them they'd have a better solution with robots.
needing each other is what kept us from killing each other so far. that is the mechanic that actually fucking works in practice, in 3D material space. again, h1b visas. it comes down to cost vs gained value, that decides above anything else. if you don't do that move somebody will and will get more power than you'd have and you risk them taking your shit in the end.
what I'm saying is that pleb culling might not be an optional move for them, it might be a matter of survival. let alone some AI algorithm predicting it for them, lol.
you can't really hope Bezos will keep you alive because he pinky promised you'd do art and shit. you also can't ask people to believe that, with their life, literally.
>>
>>16547658
>They will do everything in their interest
Practice shows that when rich men are left to do what they find best, everybody's better off: when commies eliminated the will of the rich, they all started to fucking starve! Though most of it could be genocide of the pro-bourgeois-mindset peasantry, every socialist economy is abysmal. Why would you trust fucking clerks to control the entrepreneurs? Clerks are the worst kind of people. But they dictate your education, so they teach you to almost deify them, their bosses especially.
>>
>>16547671
I don't trust anyone, I am saying that I don't see a valid mechanic, I only see hope. Which maybe is better than nothing sure.
Without work people will have a lot of time on their hands. Even if you care for some of them, a good part will wild out. A good bit of them will cause issues.
Right now, pleb issues are offset by their direct usefulness. They do shit/act wild but their work is welcomed into the system, so plebs fill the glass each time they do shit but empty it when they do useful shit, thus you get to a level of tolerable bullshit, in the name of the work they put in.
When useful work is done, that glass doesn't empty anymore, it just keeps filling up, and when it spills over everybody will go, it's an expiry date if you are not useful anymore, sooner or later. That is what I think is a very possible outcome and it's scary.
>>
>>16547680 (me)
If you don't believe me test it. Make all of them sign a contract that when post-scarcity arrives (clearly way more details than just that) they all freely give up all their wealth and become common like any other citizen. Everybody shares everything in equal measure. See how that goes for you. See how they react to that.
Also stipulate that whenever that happens, their family has to as well, doesn't matter if in 50 years or 100 years, any one of them freely gives everything away and we all equally enjoy everything. Somehow. See how this experiment goes for you lol.
>>
>>16547671
I wouldn't draw that conclusion necessarily. those past elites existed in a context where god, nation, social obligation and charity were pretty big. also a good chunk of them actually cared about the state of their economic area, for lack of a better term (it could have been a city, region, country), because they were tied to it and the power differential between them and other nobles, a monarch or the population was not that big.

a 17th century french noble would have had a hard time moving across the world without incurring severe penalties on his wealth, his workforce and so on, there was also the very real possibility that they might just randomly die on the boat trip. the colonists in the americas were mostly poor people financed by the rich of their respective societies, and they started moving to the new world only when a proper infrastructure was already in place and the colonies were decently safe.

Today it's way easier for richfolk to just up and leave, with limited liability corporations, trusts, offshores, crypto is a big hit with them too, and so on. there are fewer limitations put on them by society and they are also way more retarded.

while I personally am not "anti-billionaire", and I am no fan of the implementations of communism, it wasn't THAT bad, and it would have been less bad if it weren't implemented by violent retards.
in my view an ideal system is still capitalistic, no doubt about it, but those fuckers need to pay more than they are, the argument that they're pushing science and technology is bullcrap, it's all an investment fraud, recycling old technology (see the hyper tunnel which merely reinvents the train), or they're supporting their commercial enterprises on the backs of open source development and darpa, or otherwise state funded research.

it's a bit unfair that in the us the budget deficit has ballooned to such astronomic numbers while merica has such a high gdp, with such a pumped up stock and real estate market.
>>
>>16547726
continued.

if tax levying were to increase slightly on those fuckers, the us could afford to at least achieve a balanced budget. they want to have their cake and eat it too. and I understand that leaving them to their devices and whims in a competitive world that has russian, arab and chinese billionaires is just good policy, to play well against the other fuckers, but it seems like it has gone on long enough, while the returns are diminishing and the average joe will be left paying the bill. also I don't think there is any way the elites from other countries will reach the american ones in wealth anytime soon
>>
>>16547686
>just that) they all freely give up all their wealth and become common like any other citizen.
Why would they do such a thing? They worked hard to get that rich. You didn't. Why would you even care? What does it matter to you what they have as long as you have what you want? They ARE going to get even more wealthy. Everybody is.
>>
>>16547726
>it wasn't THAT bad
oh it was! I lived in its most vegetarian times and it was a shitshow
> it would have been less bad if it weren't implemented by violent retards.
How could it be? Chomsky seems not to notice that what he offers was tried right after the revolution. Totalitarianism became a necessity as a result, when the workers sold and destroyed the factories they now owned. Boy, for somebody with such a poor mindset you speak so much. Learn to listen. Start reading books about how to become wealthy. They may be hurtful to read, but you should change the way you think. I'm sorry if it sounds like you're not good enough. Nobody is, really.
> recycling old technology (see the hyper tunnel which merely reinvents the train)
What are you talking about? The hyper tunnel is not a train. It resembles one, but every technology "recycles" some old technology.
They are definitely doing more for the common good than you, why should they pay you if you do not work for them?
Then you argue about government being unfair. Well, duh..
Ai should write programs which will replace those corrupt retards. This should be implemented in all countries at once, so that even high-level interactions are performed by transparent algorithms. Then we get money for ubi. And I even heard the term universal good income as the next goal on that path.
>>
>>16547764
you must have misunderstood me about communism, I specifically wrote that my ideal system is a capitalistic one, but it was genuinely not that bad for the average joe. On the contrary, there are dozens of nostalgics and people that were sorry for the regime change right in the moment. another example would be france, which is very much left of the norm, and people there seem to be reproducing near replacement levels, whereas the competitive growing economies are suiciding essentially, so they need to keep importing populations to make up for the lack of births. btw I am not a westerner and I am older than the average user on 4chan so I know what I am talking about with regards to communist and post-communist cunts, the decline of the 70s and 80s hit them hard and they did not manage to recover for whatever reason, and they had genuine retards like ceausescu or breznev in power, so that matters too.

and no, not all technologies are pertinent in their recycling of older technology, some are just marketing gimmicks or downright ponzi schemes, history is full of such stories.
>>
File: 71YVwuea-LL.jpg (206 KB, 1950x1200)
>>16547778
>they need to keep importing populations to make up for the lack of births.
They don't. A: automation is happening. B: artificial wombs would compensate (the technology is supposed to be available in less than three decades, so the childless among us may still have some engineered grandchildren).
>>
>>16547756
>They worked hard to get that rich.
Same as anyone else who put in work. They get there only if people keep working.
For example, if plebs were to stop working right now without further assurances, everybody wipes, including them. Eventually. Without their robots they are at the mercy of pleb work, to get there. They CANNOT do it without pleb working, at this point. They'd have anyone working at gun point if plebs were to stop for any reason atm. That's the bottom line, work, not humanity or promises or stupid dumb shit like that, but what actually matters, work.
Also what would they even need resources for in post scarcity? Those should be assigned to the project you want to work on if it needs resources and you want to do shit as an interest.
Also how would them having so many resources work? As compared to others? Would they be some sort of royalty like the old days? Forever rich just because they stepped into post-scarcity by being rich? That's dumb. It's a forever lockout while today everybody got a chance to make it up there.
Anyway what I'm pointing out is that there's no way humans won't get rid of useless humans, when the useless humans are the overwhelming majority of humans. Right now you can get by with NEETing because you're irrelevant, overall they get way more work than needed resources for NEETs.
Also what, they won't want to give up their resources but actually they'll want to give up their resources for pleb to have a happy life? Where are those resources coming from? Not from whoever fucking has them in the first place? Billionaires will have tech generating the resources, they will have to give them away for nothing to keep everybody alive and happy. How long do you think they'll just keep giving away resources until they have enough? Since as you said, they worked for them, it's "theirs".
>>
>>16547788
>artificial wombs
isn't that just paying women to birth (your) child?
>>
File: 1623448147831.png (102 KB, 468x580)
>>16541233
i can kinda reply to you
the most straightforward way to go beyond is AI systems, basically putting more models together, like a brain has different areas for different things...
about AGI/ASI or whatever, there are different visions. some think we should just scale the shit to oblivion until something emerges, others think we should improve continual learning and metalearning
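
to make the "putting models together" idea concrete, here's a rough sketch in plain Python (all the names are made up for illustration, not any real framework): a cheap router decides which specialised model handles each input, a bit like brain areas specialising in different tasks.

from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Expert:
    name: str
    handle: Callable[[str], str]  # each "model" is just a stand-in function here

def route(query: str) -> str:
    # stand-in for a learned router; in a real system this would be another model
    if any(ch.isdigit() for ch in query):
        return "math"
    if query.lower().startswith(("draw", "show")):
        return "vision"
    return "language"

experts: Dict[str, Expert] = {
    "math": Expert("math", lambda q: f"[math model answers: {q}]"),
    "vision": Expert("vision", lambda q: f"[vision model answers: {q}]"),
    "language": Expert("language", lambda q: f"[language model answers: {q}]"),
}

def system(query: str) -> str:
    # the "system" is just the composition of router + experts
    return experts[route(query)].handle(query)

print(system("what is 2+2"))     # routed to the math expert
print(system("draw me an egg"))  # routed to the vision expert

obviously a real system would replace every piece with a learned model and share memory between them; this is only meant to show the shape of the architecture.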

thank you for your attention
>>
>>16548025
>Same as anyone else who put in work.
No, not the same. Unlike the plebs, they also worked smart. I know it may seem unfair that somebody was born smart while most were born stupid, but it's thanks to such smarties that we don't live in caves anymore. Or rather thanks to even smarter individuals, the ones who actually invent things, but those super-smart would die misunderstood if not for that other, extraverted type of smart guy.
> Forever rich just because they stepped into post-scarcity by being rich? That's dumb.
Not dumb, it just makes you envious. Also, the usual cycle is: the first generation makes the wealth, the second generation lives off the wealth, the third generation loses the wealth. Probably this only applies to plebs suddenly getting rich, without the culture of being rich.
> there's no way humans won't get rid of useless humans
But they never did. Not even the most despised groups were exterminated when we easily could have done it. Not even idiots are exterminated, and those who tried are the ones suffering now: https://www.youtube.com/watch?v=g0JLXjFiVrs
> Also what, they won't want to give up their resources but actually they'll want to give up their resources for pleb to have a happy life?
Yes, so losers like you are not as annoying. Maybe you'd even learn to be grateful. Is that a possibility? Gratefulness is one of the qualities necessary for becoming successful in life. Bitterness definitely doesn't help. Neither does entitlement.
> Where are those resources coming from?
Which ones? Raw materials are mined. They never belonged to you, and when they start mining asteroids your kind will also claim that those stars always belonged to you. But they never did.
> Oh long do you think they'll just keep giving away resources until they have enough?
They never get enough. The more you give, the more you get (that's another of their canons).
>>
>>16548025
(the comment was too long, so here's the continuation)
> Since as you said, they worked for them, it's "theirs".
Consider it their "thanks": as I said, they practice gratefulness, and they have noticed that you also work hard, and that the introverted geniuses on whose ideas they build their wealth often come from your midst. The more of you there are, the more of those bright ones. The better you are doing, the better the chances those geniuses have to develop.
>>
>>16548042
No, I was speaking of that technology.
I watch the trend: in 1988 they couldn't keep a foetus alive for more than two weeks, and several years ago they managed to carry lambs for a few months, so thirty years is a very pessimistic prediction; chances are it is going to be available for humans any time now.
>>
>>16548077
>Also the usual cycle is first generation makes the wealth, second generation lives the wealth, third generation loses the wealth.
once you're technologically self-sufficient you have an almost endless resource-generating system with zero input. that makes it even less likely they'd ever lose it. lol
this is not about envy, as I don't envy their life. it's pretty shit overall: they gotta work, deal with shit people, shit situations, stress, constantly defend themselves. all for a slightly better-tasting something. once you've had a few prime pussy and tried some of the finer stuff in life it gets obvious it's all a scam, at least for some peeps with the 'tism. billionaires feel orgasms just like the plebs do, they just gotta deal with an insane amount of extra shit for better-tasting whatevers. most of them are looking into phone/computer screens most of the day, like we all do. they fly in planes? they're on a computer online doing something. their conscious experience is almost identical to ours, just with fancy leather seats or some shit. that difference is not worth stressing out like that.
this isn't about envy, it's about the risk of them having to cull everyone because there's no working mechanism left that includes everybody.
>Maybe so you even learn to be grateful. Is it a possibility?
are you grateful for what you're learning from random anon posts here?
>Gratefulness is one of the qualities necessary for becoming successful in life.
don't patronize me dipshit
>>16548077
>They never belonged to you
Then if someone can just take them from them, sort of like farming billionaires, that's fair, right? because someone smarter than them manages to extract their value, for a higher purpose of course?
>>
>>16548094
>once you're technologically self sufficient you have an almost endless resource generating system with zero input.
Yes, this is what each of us should be striving for. Thriving. I work on a unit that keeps a human safe and sound, fed and warm, like in a womb. With only electricity as the input, and whatever its occupant feels like doing as the output. And all my friends are too busy working their petty jobs to help me, so I'm a billionaire in the making; the chances are pretty slim, yet I somehow keep on striving, growing, even developing, though not earning yet.
To give what I eventually build to those who didn't even help? Lol. Even if I can afford to pay them some day, they didn't earn it, they didn't build it, they'd just be using the service my work provides.
> are you grateful for what you're learning from random anon posts here?
I thought I was teaching here. I'm learning how to teach. Thank you. Yes, you're alright. Sent you a ray of my gratefulness. Did you feel it?
> don't patronize me dipshit
only to get that in return. not very nice. I'm teaching you to think of it differently. What should I learn? That they're a scam? Some of them obviously are. Inevitably. Some of you are too. So? If I missed something, please let me know, I was too busy explaining.
> Then if someone can just take them from them, sort of like farming billionaires, is fair right?
> because someone smarter than them manages to extract their value, for a higher purpose of-course?
It's not legal to farm humans like cattle anymore, but then crime isn't legal either, and neither is revolution. To prevent revolution they try to keep you doing alright, because in a world of AI assistants propaganda is not going to work.
The world is changing. Buckle up.
>>
>>16548108
>I work on a unit making humans safe and sound, fed and warm, like in a womb.
that sounds interesting. some sort of sensory deprivation tank or?
>The world is changing.
welp that's for sure.
>>
>>16548113
>some sort of sensory deprivation tank or?
way better. the sensory deprivation tank is the closest thing on the market and it's ridiculous; my project is so much better that I'm puzzled why I cannot find investors. Living in Russia's not easy, lol
So now I'm almost giving it away, for somebody else to join the quest. The market for that product is empty, there's room for everybody. And I've been dilly-dallying for so long that it's simply my duty to pass it on, at least in this form, and also to make sure it is in the public domain so nobody patents it before me. When I succeed I may patent some devices for the thing, but by then I'll have good legal specialists and a good company to actually earn on those patents (most patents are effectively fines: inventors pay for them only to get nothing out of them, and in twenty years they expire, and I've been at this so long that mine would have expired already).
>>
>>16547582
The human is like a plant. It's an evolutionary thing that is part of nature.

>I don't think we ever will understand the works of the human brain or of the ai brain
You don't understand it by knowing all the pathways, just like you don't understand what temperature is by looking at every individual molecule.
We take complex things and find the simplicity in them.
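To make the temperature analogy concrete (this is just the textbook relation, nothing specific to brains or AI): for an ideal monatomic gas the temperature is nothing but the average kinetic energy of the molecules, ⟨E_kin⟩ = (3/2) k_B T, so a single number summarises on the order of 10^23 trajectories you could never track individually. The hope is that something similar exists for brains and neural nets: a few coarse variables that capture what billions of synapses or weights are doing, without tracing every one.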


