g-guise, we're fucked
they said that about GPT-2 as well, it's just marketing/hype until proven otherwise
>hire army of security-focused SWEs to find exploits in linux, openbsd, ffmpeg over the course of months
>pretend it's the new scary model casually one-shotting foundational software
should literally be illegal to do this bullshit
>>108559827
Reminder that "Open"AI considered ChatGPT 3.5 too dangerous and thus decided to stop being non-profit OpenAI and become a for-profit company that releases its models closed source and behind an API and security guidelines for "safety" reasons.
Today we have uncensored open weights and finetunes that are magnitudes more powerful than ChatGPT 3.5.
Ah yes. The China approach to marketing
They found the infinite money glitch. Just perpetually claim your AI shit is "too dangerous to get released" and continue extracting money from overly excited investor piggies.
It’d be hilarious how they keep making this bullshit up if they weren’t using it to steal money and force entire industries out of work.
>>108559827
>Hey can I copy your homework
>Yeah just change it so that the teacher doesn't notice
Waiting for the Google one for Gemini 3.2 or whatever the fuck they decide to call it
>>108559827
It’s genius 4D chess
>develop product
>it’s shit
>utter, complete dogshit
>crap can’t sell this
>cancel product
>tell people it was too powerful, too dangerous
>y-yeah! so good it’d break the world or something!
>take old product, rebrand it, sell it as a “safe” version of the deadly poo product.
>>108559827
Bill Joy made a text editor so good, but it got released anyway.
>>108559827
Didn't they do this same theater with GPT-2 back in the day? And then with GPT-3. Don't remember if they tried the same with 4. With 5, they knew they had a disappointing dud.
>>108559827
Buy an ad
Get ready to prompt like a billionaire
>>108559857
>find exploits
>found a few non-exploitable crashers
>>108559827
ITT: Bots don't know today's date
>>108560236yeah reading the replies of this thread is absolutely grim
>>108559827
didn't anthropic just do this with mythos?
>>108559827
>my product is so good guize trust me
>i am definitively """"not"""" releasing it! xdddd
>investers giv money plox
>>108559827
remember when altman posted the death star before the release of gpt-4.
these fags are snakeoil salesmen
Finding exploits in open source projects is shooting fish in a barrel.
Give it to a complete amateur (doesn't mean stupid, just someone who never chased bug bounties or did capture the flag) for two months to make money at the next Pwn2Own, with some pledge to not collaborate with anyone but the model. Let's see how it does.
I have AI exhaustion. Can Mythos make me a suicide pod so I no longer have to deal with this slopiverse?
>>108560309
And it just keeps working. Shareholder cattle is that dumb.
>>108560013
this
>>108559827
>it's so crazy scary good, goy!
>no you can't see it.
>gibs moneys pl0x.
I have to admire the sheer chutzpah.
>>108559827
this time for realsies
>>108560246
>>108560236
>he thought /g/ wasn't retards
So what, I did that too. No I won't show you or release it, just trust me.
>>108559827
it's propaganda, it's still as horse shit as the rest and this dogshit company will keep bleeding money until they disappear
rest in piss bozos, they won't be missed
You thought Mythos was scary!? Wait until you see our new super sekrit model we're developing.
It's so advanced and dangerous we sandboxed it and then it ESCAPED!! and then it sent me an email telling me about it after I gave it the instructions. HOLY AGI BATMAN!
We don't have a name for this monster but right now, internally, we're calling it the "Doomsday 6.66"
>>108560309
burn gorman is looking like shit these days.
>>108559827
There are more than three fucking threads advertising this shit on /g/ right now. Do you cunts actually program or are you all marketing bots?
>>108561798
>Do you cunts actually program
>/g/
lol. lmao, even.
>>108559827
IN A WORLD...
>>108559963
I wonder what the containment was. We're meant to assume it was some ultra secure shit, but I bet it would be laughable. It might even be something where instructions to escape were included in its training set.
>>108559827
It turns out they were right, everything is now spiraling out of control and we have people deciding when to drop nukes asking public ChatGPT instances if it's ok to do so. They unironically should have killed it then and there.
>>108560127
jump into an acid vat
>>108562093
NEVERESCAPE.md