/g/ - Technology

File: 1726291842743043.png (402 KB, 2124x1734)
>M4 Max, 128GB unified RAM

What is /g/'s verdict on running local LLMs on this? Should I drop 5 grand to get my own coding assistant and local anime girl generation?
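For reference, "running a local LLM" on a Mac like this usually means pointing llama.cpp (or its Python bindings) at a quantized GGUF file and letting Metal handle the GPU offload. A minimal sketch, with the model path and parameters purely illustrative:
[code]
from llama_cpp import Llama  # pip install llama-cpp-python (builds with Metal support on Apple Silicon)

# Model path and settings are illustrative; any locally downloaded quantized GGUF works the same way.
llm = Llama(
    model_path="models/qwen2.5-coder-32b-q4_k_m.gguf",
    n_gpu_layers=-1,  # offload every layer to the GPU (Metal)
    n_ctx=8192,       # context length, bounded by available unified memory
)

out = llm("Write a Python function that parses ISO 8601 dates.", max_tokens=256)
print(out["choices"][0]["text"])
[/code]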
>>
>>103295728
>What is /g/'s verdict on running local LLMs on this
i think you should hang yourself for paying $5000 for a laptop
>>
>>103295739
It's cheaper than four 4090s.
>>
>>103295739
>$5k for a fucking laptop
Name another machine with this much GPU memory at a lower price.
>>
>>103295728
>5 grand to have the "(v)ram" be slow as absolute shit
lol, either rammax for like $300 and be able to run everything (just slowly), or rent GPUs online, or pay for access to the top current model if you don't care about privacy
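The bandwidth complaint comes down to simple arithmetic: single-stream decode is memory-bound, so tokens per second is roughly capped at bandwidth divided by the bytes read per token (about the size of the quantized model). A rough sketch, using approximate spec-sheet bandwidth figures rather than measurements:
[code]
# Rough ceiling for memory-bound decode: tokens/s ~ bandwidth / bytes read per token (~ model size).
# Bandwidth figures are approximate spec-sheet numbers, not benchmarks.
model_gb = 40  # a ~70B model at 4-bit, weights plus some KV cache

for name, bw_gbs in [
    ("DDR5 dual-channel desktop (rammaxed)", 90),
    ("M4 Pro unified memory",                273),
    ("M4 Max unified memory",                546),
    ("RTX 4090 GDDR6X",                      1008),
]:
    print(f"{name:40s} ~{bw_gbs / model_gb:5.1f} tok/s ceiling")
[/code]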
>>
>>103295728
get the 14
>>
If you can afford it, why not? I got a 14" Pro with the 20-core GPU/48GB/2TB and the nano-texture display for like $3300.
>>
>>103295728
Are giant models that much better?
8B LLMs and SDXL have served me well, and those run on the base M4.
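For the image side, SDXL on Apple Silicon typically goes through diffusers' MPS backend; a minimal sketch (model ID, prompt, and step count are just examples):
[code]
import torch
from diffusers import StableDiffusionXLPipeline  # pip install diffusers transformers accelerate

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
)
pipe = pipe.to("mps")  # Apple GPU via Metal Performance Shaders

image = pipe("1girl, anime style, studio lighting", num_inference_steps=30).images[0]
image.save("out.png")
[/code]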
>>
>>103295728
if you're a crypto bro and made a killing recently off the market then hell yes.
>>
>>103295728
>272GB/s
Trash
>>
>>103295739
Get a job, fucking poorfag
>>
>>103295728
macOS support for open source apps is trash
>>
>>103297114
That's fucking retarded; the majority of open source software is developed on Macs.
>>
>>103297036
i could buy 9000 burgers with that money
>>
>>103296664
how is the display?
>>
>>103297007
576GB/s on the max
>>103297114
good
>>
>>103295755
GPU memory isn't everything when benchmarking GPU performance.
Sure, it helps with some AI training, but if it's slower, why not get a proper GPU?
>>
>>103295728
no
get the new mac studio m4 ultra next year
>>
>>103297760
It's everything for AI workloads. Bandwidth + memory + fast cores.
>>
>>103297760
>AI training
you literally can't run the model if you don't have enough memory.
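The point in numbers: the weight footprint is roughly parameter count times bytes per weight at the chosen quantization, with KV cache and runtime overhead on top. A back-of-the-envelope sketch:
[code]
def weights_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight footprint in GB: billions of params * bits per weight / 8."""
    return params_billion * bits_per_weight / 8

print(f"70B @ 4-bit : {weights_gb(70, 4):6.1f} GB")   # ~35 GB: fits in 128 GB unified memory
print(f"70B @ 16-bit: {weights_gb(70, 16):6.1f} GB")  # ~140 GB: far beyond a single 24 GB 4090
print(f"8B  @ 4-bit : {weights_gb(8, 4):6.1f} GB")    # ~4 GB: fine on a base M4 or a mid-range GPU
[/code]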
>>
>>103296915
i bought a quarter of a bitcoin the moment trump was elected; it's money, but not maxed-out applefag money


