/g/ - Technology

File: 61wbV8oqAbL(1).jpg (94 KB, 1500x698)
Why do you need 24GB of VRAM? What will you do with it that 16GB can't? Most people don't need more than 12, and you NEET high school dropout hobbyists can get by just fine with 8GB for your projects and old games.
>>
>>107842575
I like making my OC characters from games do lewd things with my friends' characters; shit maxes out everything.
>>
>>107842575
SMUT! ComfyUI and koboldcpp best friends now.
>>
>getting money mogged by NEETs
A grim destiny for any wagie.
>>
>>107842575
I run a 24b model at full speed.
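For scale, here's a rough back-of-the-envelope sketch of why a 24B model is comfortable on 24GB but tight on 16GB; the bytes-per-weight figures are assumed ballpark values for common GGUF quants, not measurements:
[code]
# Rough VRAM math for a 24B-parameter model; bytes/weight are assumed averages.
PARAMS = 24e9
QUANTS = {"Q4_K_M": 0.57, "Q6_K": 0.82}  # approx bytes per weight, incl. overhead

for name, bytes_per_weight in QUANTS.items():
    weights_gb = PARAMS * bytes_per_weight / 1024**3
    print(f"{name}: ~{weights_gb:.1f} GB of weights, before any KV cache")
# Q4_K_M (~12.7 GB) barely leaves room for context on a 16GB card;
# Q6_K (~18.3 GB) plus a long-context KV cache is firmly 24GB territory.
[/code]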
>>
>>107842575
erp
>>
>>107842575
I could run a 500B model locally
https://www.youtube.com/watch?v=T17bpGItqXw
>>
>>107842575
I'm glad I bought a 7900 XTX when I did, because we're about to reach a point where it's impossible for proles like us to ever own our own GPUs ever again.
>>
>>107842617
>>107842633
>>107842726
When the post nut clarity kicks in and you get over your porn addiction, you'll realise in anger that you spent hundreds of dollars for nothing. AI is not a valid use case for humans doing actual work, and even then you're not editing 8K (let alone 4K) video, playing games at 4K, using 4K textures, or even rendering shit that takes up more than 8-12GB.
>>
>>107842741
>it's impossible for proles like us to ever own our own GPUs ever again
You're really basing your decision to blow $700-1000 or more on a maybe like this? Were people saying this in 2019/2022 as well, before FOMOing in and buying high/selling low?
>>
>>107842821
You sound upset, anon. What's up?
>>
>>107842821
I do not care, and there's no post nut clarity either; I just enjoy doing it. Most of the time I don't even goon to it, it just adds a sprinkle to ERP while edging.
I still support and commission actual artists, if it makes you feel better.
>>
File: 1751339307756323.mp4 (3.94 MB, 1120x1120)
>>107842821
>When the post nut clarity kicks in and you get over your porn addiction, you'll realise in anger you spent hundreds of dollars for nothing.
Has not happened yet.

>playing games at 4K
I play games at 4K, but I use my gaming PC for that, LG C4 42" and 4080S (less VRAM but faster).
I have a dedicated AI box; 3090s were cheap AI cards a while ago.
>>
>>107842821
holy seethe
>>
>>107842844
The RTX 5070 should have had 24GB to prevent this kvetching.
>>
>>107842838
Because I could and because it'll last me a very long time.
>>
>>107842912
The 5070 Ti Super in 2027 fixes this.
>>
>>107842935
WW3 is mid 2026 so doubt.
>>
>>107842869
>>107842857
>coom coom porn porn erp erp
Not beating the allegations when you're justifying a nice plane ticket's worth on a toy rather than a tool.
>>
File: 1743571672137079.png (229 KB, 720x720)
>>107842957
Not strictly just for gooning, but true, my main use for GPUs is gaming and AI.
>>
>>107843043
By OCs I meant just our MMO characters, but yeah, some of them take ERP-ing way too far into the OC category, even commissioning shit.
Just mental illness overall.
>>
>>107843051
Kek, based.
>>
File: 1759613734062665.mp4 (395 KB, 320x640)
>>107842575
You would never understand.
>>
>>107842575
8GB more VRAM means I can maintain a larger context, offload more layers to the GPU, or run an embedding model alongside the LLM. If you're just gaming, go buy an AMD card.
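A minimal sketch of that trade-off using llama-cpp-python (koboldcpp sits on the same llama.cpp backend); the model path is hypothetical and the numbers are only illustrative. n_gpu_layers and n_ctx are the two knobs extra VRAM lets you turn up:
[code]
# Assumes llama-cpp-python is installed with GPU support; the GGUF path is made up.
from llama_cpp import Llama

llm = Llama(
    model_path="models/my-model-q4_k_m.gguf",  # hypothetical local GGUF file
    n_gpu_layers=-1,   # -1 = offload every layer to the GPU; lower it when VRAM runs out
    n_ctx=16384,       # a larger context window means a larger KV cache in VRAM
)

out = llm("Why does extra VRAM help local LLMs?", max_tokens=128)
print(out["choices"][0]["text"])
[/code]
koboldcpp exposes the same knobs (GPU layer count and context size) from its launcher, so the same VRAM trade-off applies there.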
>>
>>107843073
I do understand, but it's the high before the low and you get bored. Unless you're rendering some crazy time-consuming shit, you won't need much memory for most assets, especially if you're in the low-poly craze. And more VRAM won't help with faster renders if there are no high-resolution textures to saturate it.
>>
>>107843101
Bitch, please. The more VRAM, the more options you have when it comes to genning.

>conventional rendering
That's so 2016.
>>
>>107842575
>Why do you need 24GB of VRAM? What will you do with it that 16GB can't? Most people don't need more than 12, and you NEET high school dropout hobbyists can get by just fine with 8GB for your projects and old games.

Merchant trolling?
>>
>>107842821
>REEEEEEEEEEEEEEEEEEEE
lol
lmao, even
>>
OP doesn't actually want an answer btw
>>
>>107842935
That'll be $2999.99 plus tax, xir.
>>
File: file.png (10 KB, 598x145)
to goon, of course
workstation cards are kinda being slept on lowkey atm
>>
>>107843263
>nShitia
Not even once


