/g/ - Technology




>UPGRADE & BUILD ADVICE.
Post build list or current specs including MONITOR: https://pcpartpicker.com/
Provide specific use cases.
State BUDGET and COUNTRY or you will NOT be helped.
Building guide: https://wiki.installgentoo.com/index.php/Build_a_PC

>CPU
Web browsing: 12100, 5600G
Gaming: 12400F, 7600, 7800X3D
Workstation: 12700K, 7900X, 7950X
WARNING 13th/14th gen owners: Update your BIOS to the latest 0x129 microcode to prevent degradation

>COOLER
AIO: Thermalright Frozen Edge/Arctic Liquid Freezer III
Double towers: ID-Cooling FROZN A620 PRO SE, Thermalright Phantom Spirit 120 SE
ITX/>42mm RAM: Scythe Fuma 3/TR AXP120-X67

>MOTHERBOARD
AM5: https://www.youtube.com/watch?v=57X2FygcqLE

>RAM
DDR4: 2x16GB 3200CL16. Budget, 2x8GB
DDR5: 2x16GB 6000CL30
Workstation/high end: consider 2x32GB or 2x48GB

>SSD (OS drive)
Low mid end: SN770
Premium: SN850X, P41
Flagship: Sabrent Rocket 5

>GPU
1080p: RTX 4070 Super, RX 7700 XT, RX 6750 XT
1440p: RTX 4070 Ti Super, RX 7800 XT, RX 7900 GRE
4K: RTX 4090, RTX 4080 Super, RX 7900XTX
Workstation: RTX 4000 Series

>CASE (from $ to $$$)
mATX: Montech Air 100, Lian Li A3, Asus Prime AP201, Lian Li O11 Air Mini
ATX: Phanteks XT PRO(ULTRA), Montech AIR 903 Base/MAX, Antec C5, Lian Li Lancool 216/III
AVOID: 'Silent' cases, fanless cases, 4000D airflow

>PSU
Budget: Gold rated 500-600W PSU
Mid range: ATX 3.0 compliant fully modular gold rated PSU @ 75% max load
High end: Seasonic PRIME TX
AVOID: GAMEMAX
PSU buying guide:
https://hwbusters.com/best_picks/best-atxv3-pcie5-ready-psus-picks-hardware-busters/ (updated Q3 2024)

>MONITOR
Standout:
1080p: MSI G244F E2
1440p: ASRock PG27QFT2A, Dell G2724D
2160p: Gigabyte M27/28/32U
https://pcpartpicker.com/list/tBTGQP

>OS
Activate Windows @ >>>/g/fwt

>CASE FANS
Meta: Case with good stock PWM fans
Budget: Arctic P12/P14 Max (5-pack)
High end: Noctua NF-A12x25 PWM, NF-A14x25r G2 PWM

Previous: >>102021808
>>
>>102030990
Reminder DLSS > native
>>
>>102030990
4070 super in 1080p, vro that is kinda Fre𝓪ky
>>
Post your unpopular PC Building opinions
>>
AVOID poomd products


>4K: RTX 4090, RTX 4080 Super, RX 7900XTX
>7900xtx
>4k
KEK
>>
everywhere I look I see Ryuuko
>>
File: 1708632994543026.jpg (32 KB, 657x527)
>>102031030
my opinion is whatever pisses off AMD fags and NvidiaShillmassreplyfag off the most
>>
File: consider the following.png (1.38 MB, 1920x1080)
>>102031030
All my PC Building Opinions are extremely popular though.
>>
>>102031023
imagine telling someone 6 years ago that his 1080Ti is a 720p card lol
>>
>>102030990
ryuuko a shit
>>
>>102031041
This. Anyone expecting a 7900 XTX to carry in 4K is in for a rude surprise. A 4080S is the bare minimum at this juncture.
>>
>>102030990
>>GPU
OP is so retarded that he organizes graphics cards by VRAM instead of performance and features
>>
>>102031041
>>102031135
>4k

Depends on the game
>>
>>102031155
>GameGPU
opinion invalid
>>
File: file.png (42 KB, 500x570)
>>102031135
>>102031041
ononono
>>
>>102031214
>look! It’s a cherry picked game with worse quality FSR enabled and no I’m not gonna show you the details because then you’ll be able to confirm this is an extreme outlier

Hmm..
>>
File: 1721567631958595.png (43 KB, 310x315)
>>102031233
average fps across 25 games at native 4k, look up the review yourself if you want
>>
can dlss/fg work on Linux (eos) with an Nvidia card? idk how to turn on hardware acceleration, I couldn't turn on fg in wukong without it
I'm dual booting anyway so it's not a huge deal, I'm just curious. i already know I can't use DLDSR on it
>>
POSTEM. finally setup my Linux boot
>>
>>102031030
Using RT is fucking retarded and you only buy nvidia because you don't want to deal with dog radeon drivers.
>>
>>102031199
>bought radeon
Opinion invalidated
>>
>>102031381
>you don't want to deal with dog radeon drivers.

Redpill me on AMD

Why do they get so much hate
>>
>>102031214
Except the reality is that Radeontards can't actually play anything new and demanding at 4K without FSR
>>
>>102031384
I had more issues after 2 years of radeon than after 15 years of nvidia. Maybe some people didn't and good for them. But I'm willing to pay more to not deal with driver crashes.
>>
>>102031384
Switched from Nvidia to Radeon last April. It has been smooth sailing since I switched; I doubled my perf and I'm getting new features (Anti-Lag 2, FSR Frame Gen, FSR 3.1 upscaling) regularly. Thanks, AMD
>>
>>102031491
>its another opposite day poster
All RTX cards have both the latest version of DLSS reconstruction and can use FSR Frame Gen if they want
>>
>>102031491
>Anti lag2
And oh yeah, even GTX 900 series cards have Reflex, which is better than Anti-Lag 2
>>
How do I make MSI repair my RMA'd mobo faster. I don't want to wait 4 fucking days + shipping when I know that they can fix it in less than an hour
>>
>>102031555
Womp womp must be frustrating for 3000 series owners who had to wait for AMD to give them frame gen, also cherry-picked super zoomed in screenshots classic
>>
>>102031579
If they have the service, ask for advance replacement, which means they ship a new mobo to you first (you put down a deposit), and they refund you when they receive your broken mobo
>>
>>102031574
Posting irrelevant info here since anti lag+ was replaced by anti lag 2 which runs on all cards from RX 5000 series and above
>>
>>102031617
Who cares? Anti lag 2 is in nogaemes, while Reflex is in hundreds, and has been for years
>>
>>102031651
Anti lag 2 is a new feature and so is only in two games yes but the list of supported games is only going to grow with time
>>
>>102031673
Thank you for proving my point
Radeon is a waste of money
>>
Personally I think Radeon's marketshare is well deserved, considering how good their products are
>>
>>102031684
shame Nvidia couldn't offer me better value then
>>
>>102031714
If you weren't a retard you'd know Geforce is better value for many reasons, especially DLSS
>>
>>102031555
>fsr
looks awful, are they even doing any actual upscaling or is it just a rebranded sweetfx filter
>>
>>102031740
Had a 2070 Super, got a 6950 XT with a brand new game (TLOU Pt. 1) for 610 USD, sold my old card for 200 USD. Doubled my perf; the only competitor at the time was the slower 4070
>>
>>102031702
>>102031798
AMD Radeon RX 7800 XT - 16 GB, GDDR6, 256 bit
AMD Radeon RX 7900 XT - 20 GB, GDDR6, 320 bit
AMD Radeon RX 7900 XTX - 24 GB, GDDR6, 384 bit

Are they really that bad?
>>
>antilag
https://www.guru3d.com/story/amd-in-counterstrike-2-can-result-in-vac-ban/
Once again, AVOID POOMD
>>
>>102031822
they are fine, depends on what kind of games you want to play and what you want to do with the pc
>>102031842
>10 months ago
irrelevant fixed already
>>
>>102031822
Really bad value. You get more raw VRAM in exchange for worse drivers, worse features, and shit tier performance in modern render pipelines. It’s great if you’re a bro who just plays COD or MOBAs all day though but then you might as well go for the budget options
>>
>>102031798
The 6950 XT is slower in new games with worse image quality than the 4070. It's in the benchmark I posted
It's also a space heater, unlike the 4070
>>
File: 1723571969718744.png (16 KB, 337x450)
Is the poojeet nvidia shill out on full force because of the 8800 XT?
Get those 10 rupees, saar
>>
why are you advertising a company that already has 90+% market share
>>
>marvel slop, hellblade, fortnite, my favorite games!
>>
>>102031822
They're fine. The problem is that if you're dropping $500+ on a video card you probably want the latest shiny graphics features (i.e. RT) and that's exactly where they fall off. If you, subjectively, don't like either RT or upscaling then feel free to go AMD. Or not - and this is the other problem - because the discount is quite small in relative terms. When you can afford a $1500 PC, $100 either way just doesn't mean much.
>>
>>102031899
I don't advertise anything. As a former Radeon owner, I would like to advise anons not to make the same mistakes I made
>>
>assassin's creed on 8 year old gpus, my favorite game!
>>
>>102031933
they fixed the dx11 driver on rdna2 and above
>>
>>102031877
Sorry, I don't play any of the slop games you posted. I play Halo Infinite, Horizon Forbidden West, Helldivers 2, and BF2042. Also, nobody who is serious about playing Fortnite uses any of the advanced graphics features, so that argument is irrelevant. Immortals of Aveum is a shit game that nobody plays, and I have no fucking clue what Fort Solis is
>>
>>102031933
>>102031968
What are the downsides of AMD GPUs
>>
>>102031967
This this so much this!! That’s why I went AMD, heckin 10% gain over nvidiots on AC2! Fuck yeah buddy! RT is for zoomers
>>
>>102031968
No they didn't
>>
>>102031970
>Halo Infinite
Xbox lol
>Horizon Forbidden west
PS4 port lol
>Helldivers 2
PS5 port lol
>>
>>102031992
Slower RT, and worse looking upscaler
>>
>>102031992
Shit drivers and more power draw.
>>102032000
>RT is for zoomers
It is
>>
rdoa4
>>
>>102031933
Funny I feel the same way about Nvidia. Guess there's no winning.
>>
Intel Battlemage dGPUs doko?
>>
>expeditions a mud uhhh my uhh
I've never even heard of that
>>
>>102032011
Sorry these titles don't fit your worldview of Nvidia = faster than Radeon this must be very disheartening, my condolences
>>
>>102031921
If I'm dropping $500+ I expect not to use budget card crutches like upscaling and frame interpolation.
>>
>>102032059
>>102032011
Alan Wake 2 will bring your 4090 to its knees if you enable path tracing and crank everything up. Play a real PC game. Not console shit.
>>
>>102032052
There is an objectively better option
which is Geforce

>>102032070
Your expectations are retarded, as reconstruction is built into most demanding current games. Additionally, DLSS is better than or equivalent to native image quality most of the time, meaning you get a massive framerate boost often with better image quality
>>
>>102032094
Why do you talk like a marketer?
>>
>>102032090
>Alan Wake 2
Lol
>>
>>102032106
Marketers called people retarded and shitheads?
Fucking based
>>
>>102032090
All the "console" games ported to PC look better and run faster than what you get on a PS5
>>
>>102031992
• Nvidia:
• Generally leads in raw performance, especially in the high-end segment (e.g., RTX 4080, 4090).
• Superior ray tracing performance due to more mature RT cores and software optimization.
• Better performance in AI-related tasks and machine learning, powered by Tensor Cores.
• AMD:
• Competitive in rasterization performance, especially in mid-range to upper mid-range GPUs (e.g., RX 6800 XT).
• Performance per dollar tends to be better in some segments, offering better value for gamers on a budget.
• Ray tracing performance is improving but still lags behind Nvidia.

Price

• Nvidia:
• Typically more expensive, especially in the high-end market.
• Often seen as the premium option, with a higher price-to-performance ratio in top-tier models.
• AMD:
• More affordable options across the board, especially in the mid-range.
• Offers better price-to-performance ratios in many cases, particularly in non-ray tracing workloads.

Software & Features

• Nvidia:
• Industry-leading software ecosystem, including DLSS (Deep Learning Super Sampling) which provides excellent performance boosts with minimal quality loss.
• Broader support for creative applications and professional workloads.
• G-Sync for smoother gaming with minimal screen tearing.
• AMD:
• FSR (FidelityFX Super Resolution) competes with DLSS but isn’t as refined, though it’s gaining traction due to broader compatibility.
• Software has improved but still seen as slightly less polished than Nvidia’s.
• FreeSync is widely supported and offers good performance for adaptive sync without additional cost.

Driver & Ecosystem Support

• Nvidia:
• Known for stable drivers and broad ecosystem support.
• Better support for Linux and professional workloads.
• AMD:
• Driver support has historically been less consistent, though it has improved significantly.
• Ecosystem support is narrower, with fewer optimizations for professional applications
>>
>>102032070
Agreed. Personally I don't much go for AAA games these days so this really isn't my fight (which is rather the point).
>>
>>102032125
Marketers love to read talking points off a checklist by rote. Them and chatbots.
>>
>>102032141
Hey chatGPT generate a summary comparing pros and cons of Nvidia and AMD consumer graphics cards!
>>
>7800X3D
>7900XTX
>SAMSUNG 990 4TB

Just works

>7800X3D
>4090
>SAMSUNG 990 4TB

Just works

Simple as
>>
>>102032170
>>
Can someone help me? I'm looking for a micro ATX case that actually has hard drive bays, hopefully 6 of them. All the ones I see listed on Scorptec (AU store) don't have any hard drive bays at all.
Doesn't need a PSU, I have one already.
Needs to be able to fit a Gigabyte GTX 980 in it.
>>
>>102032183
I'm not Chinese lol
>>
>>102032183
I FUCKING LOVE RAY TRACING I LOVE CUTTING MY PERFORMANCE IN HALF I LOVE UPSCALERS I LOVE AI FRAME GENERATION I LOVE RAY RECONSTRUCTION AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
>>
File: 90905_untitled-14.png (55 KB, 703x932)
>>102032198
>>
>>102032214
Only on RadeonTM
Nvidia loses 6% on Medium path tracing
30% on High path tracing
>>
>>102032165
Pros of Nvidia Graphics Cards:

1. Performance Leadership: Nvidia consistently outperforms AMD in both raw power and efficiency
2. Ray Tracing and DLSS: Nvidia pioneered real-time ray tracing, revolutionizing in-game lighting and shadows for a more realistic experience. Their Deep Learning Super Sampling (DLSS) technology uses AI to boost frame rates without compromising image quality—a feature AMD struggles to match.
3. Software Ecosystem: Nvidia’s software suite, including GeForce Experience, offers seamless game optimization, driver updates, and features like ShadowPlay for easy gameplay recording. AMD’s software is less polished and lacks the same level of refinement.
4. Driver Stability: Nvidia drivers are renowned for their stability and frequent updates, ensuring your games and applications run smoothly. AMD has had a history of driver issues, which can lead to frustrating gaming experiences.
5. Industry Support: Nvidia’s CUDA cores are widely adopted in various industries, from gaming to AI research and content creation.
6. Resale Value: Nvidia cards typically retain higher resale value compared to AMD, reflecting their long-standing reputation for quality and performance.

Cons of Nvidia Graphics Cards:

Null

Pros of AMD Graphics Cards:

Null

Cons of AMD Graphics Cards:

1. Inferior Ray Tracing Performance: AMD’s ray tracing capabilities are a step behind Nvidia’s, offering lower frame rates and less refined visuals.
2. Software and Driver Issues: AMD’s software lacks the polish of Nvidia’s offerings, and their drivers have a reputation for being less stable, which can be a significant drawback for gamers.
3. Limited Industry Support: While AMD has its strengths, Nvidia’s CUDA technology dominates in professional applications, leaving AMD as a less attractive option for content creators and developers.
4. Lower Resale Value: AMD cards tend to depreciate faster than Nvidia, reflecting their lesser market demand and perceived value.
>>
>>102032222
consume the pathtracing slop goyim
>>
amd fans are losing their minds over rt kek
>>
>>102032249
Nvidia marketing for free? in this economy?
>>
>>102032250
>>
Has anything filled the void Optane left behind?
>>
>>102031030
Framerate is far less important than stability. It won't matter if you get 300 FPS when your computer hard resets in the middle of a game, or if you have constant stuttering.
>>
Looks like the RDNA4 leak got the local nvidia shill really riled up lel. Keep fighting the good fight!
>>
File: dlss vs vram.png (1.34 MB, 1298x1254)
>>102032371
Post the "leak," haven't seen it
>>
>>102032385
the latest article on lemonparty.org
>>
16gb unlocks 1080p gaming
>>
>>102032417
it's clearly faster on the 8GB card
>>
>>102032417
>one of the amd sponsored abominations
>>
>>102032385
Holy shit.. DLSS is notably better than native res… incredible
>>
File: T6G2R0DDXao.jpg (10 KB, 224x224)
Why is xhe picking only games nobody plays?
>>
>>
>>102032336
>Has anything filled the void Optane left behind?
Sadly no. Why haven't RAM Disks become a real thing now that high-capacity DIMMs are more affordable?
>>
What were the top gpus in 2004 and 2014
>>
>>102032194
>Can someone help me? I'm looking for a micro ATX case that actually has hard drive bays, hopefully 6 of them
Your best hope:
https://au.pcpartpicker.com/products/case/#t=7,6&J=6,20&sort=price&page=1
^Fractal Design Node 804 when it comes back in stock at 165 AUD on MWave. There are other models that fit your requirements but I don't see them listed for stock.

If you are willing to live with only 4xHDD slots your options expand a bit:
https://au.pcpartpicker.com/products/case/#t=7,6&J=4,20&sort=price&page=1
The Antec VSK 3000 Elite & Cooler Master Silencio S400 are both available at 110 AUD.
>>
>>102032614
If you mean for gaming it's basically because any solid state storage shifts the bottleneck on to the CPU. They're working on GPU decompression but adoption is slow.
>>
>>102032369
that's a skill issue
you can have both
>>
>>102032614
They're in enterprise: CXL memory (it's made of DRAM, but I don't know if it's nonvolatile)
>>
>>102031041
Yeah, the 7900XTX is good in 4K for older games, not so much with anything modern.
>>
>>102031214
Where is the 4080 Super? That's the one people are actually buying, not the 4080.
>>
>>102032954
>>102032961
4080 Super and 4090 are 1440p cards. 5090 will be the actual 4K card
>>
>>102032887
Interesting. I wonder if it or CAMM will break out into mainstream desktop ecosystem soonish.
>>
>>102032994
What do you mean? I use my 4090 in 5120x1440 so not quite the same as 4k, but I get well over 100fps in most modern games at max settings. If it's a couple of years older, I'm usually hitting the engine limit. Yeah, some of the more recent shit needs frame gen if you use ray tracing, but that's fine with me in most single player games.
>>
File: file.png (580 KB, 913x599)
If the PC case I'm buying comes with 4 fans, do I need to buy more fans?
I've heard you should always put in as many fans as your case can allow, but I'm wondering if it's necessary.

I'm scared that if I don't add extra fans, my parts will melt or something
>>
>>102030990
>MSI G244F E2
What's the catch? It's $100 and I can't find a single review not made by an Indian.
>>
https://www.youtube.com/watch?v=P7Eyv5d2Rq8
Worst launch since Bulldozer.
>>
>>102032454
One would assume games would run noticeably better on AMD's hardware considering both PlayStation and Xbox use their hardware.
>>
>>102033079
>Bulldozer

QRD on the dark days of AMD
>>
>>102033107
It's not the hardware. It's the developers and Indian programmers

Starfield end credits is 45 minutes long. Why? They outsourced the heck out of it.

27 different studios: Sonic Boom Sound, JSR Post, Iron Galaxy, The Multiplayer Group, Spera Soft, Snowed In Studios, GameSim, The Forge, Nobody Studios, Undertone FX, Wardog, Sparx*, Scyth Games, Rouge MoCap, RedHot, Kaptured Motions Inc., Airship Interactive, Lakshya Digital, NXA Studios, Goodbye Kansas, GL33k, FuryLion, Cubic Motion, Cloud Mark, 原力 aka Original Force, and 曼德沃克工作室 (Mindwalk Studios).
>>
File: 66026.png (57 KB, 600x600)
>>102033108
all you need to know
(this is technically piledriver, but same shit)
>>
>>102032961
The 4080 Super is maybe 1% faster than a 4080 on average
>>
File: Bulldozer_Die_Map.jpg (60 KB, 500x442)
>>102033108
Shit performance, and they lied about having 8 cores when it's actually only 4 cores. At least they were cheap.
>>
File: 1651789575108.jpg (53 KB, 500x500)
I want to replace the thermal pads in my gpu with thermal putty, whats available and what should I be looking for?
>>
>>102032214
the 4090 caught up to 4k RT too fast which is why Nvidia is shilling path tracing.
>>
>>102033107
They don't
>>
>>102033213
Best thermal putties in no order for VRM/VRAM: CX-H1300, UPSIREN U6 PRO, Zezzio ZT-PY6, Jeyi 8100, LK-PRO, EVGA Putty, Penchem TH949-1, Penchem TH855-5, TG-PP10, Penchem TH930, KPT-8, MG860
>>
>>102033273
thanks
>>
>>102033107
The ones that do are the ones that heavily use L3 cache on the CPU and GPU and are the ones that usually run better on AMD than Nvidia. Call of Duty and Resident Evil games are two examples of this.
>>
>>102033108
They tried to scale up cores, but one of the compromises they made to get more cores was that each core would share the floating point unit with another core, so if you bought an eight core CPU, you would only get four working cores at a time when computing floating point calculations (you are basically always doing something that uses float).

One of the unintended benefits of the bad design was that it took high voltage well, so you could pump a million watts into it and overclock it. This didn't help much.
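The shared-FPU compromise described above can be sketched as a toy model (Python; the module layout matches the post, but the function names and numbers are illustrative, not measured):

```python
# Toy model of Bulldozer-style modules: each module pairs 2 integer
# cores with 1 shared floating-point unit. Illustrative only.

def effective_fp_cores(advertised_cores: int, cores_per_module: int = 2) -> int:
    """FP-heavy workloads are limited to one FPU per module."""
    return advertised_cores // cores_per_module

def effective_int_cores(advertised_cores: int) -> int:
    """Pure integer work can use every advertised core."""
    return advertised_cores

# An "8-core" FX chip behaves like 4 cores once floats dominate.
print(effective_fp_cores(8))   # 4
print(effective_int_cores(8))  # 8
```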
>>
The green goblin is back. I wonder how much he earns per post
>>
>>102032454
The sponsorship means nothing. That's how the game works.
>>
>>102033407
He just reposts the same folder every thread.
>>
>>102033453
AMD's sponsorship = high chance of tech disaster
>>
>>102030990
Anything I need to change?
https://pcpartpicker.com/list/Ddx7mD
>>
>>102033493
The Radeon garbage
>>
>>102033478
Sorry, but no, the game works that way just because the game works that way. That's what video games do when you run out of VRAM.

btw why are you talking about AMD? Could there be a reason? We all know there's no reason to complain about AMD or try to redirect when a legitimate issue like VRAM becomes a problem.
>>
>>102033506
>why are you talking about AMD?
Because it's sponsorships often lead to technical disasters
>>
>>102033504
I have really efficient heating in my house so I don’t need intel
>>
>>102033517
Nah, repeating the same shit isn't going to work and you know that. All anyone reading this is thinking is that you're redirecting as usual.

This DOESN'T work. You AREN'T in charge.
>>
>>102033493
You might consider a B650E motherboard that supports PCIe 5.0, because this would let you upgrade your graphics card later.

Be careful about which motherboard you buy if selecting a different one, a lot of them have bad VRM. If you don't care about upgrades at a later time, the motherboard you picked is fine.
>>
>>102033493
An XTX, even if it's a Nitro+, is hard to justify at that price when there are 4080S for a similar price point
You lose out on a bit of rasterization performance but marginally so, but you get all the Nvidia stuff
4070 vs 7800XT or 4060Ti vs 7700XT I lean more towards the AMD cards even at similar prices, but 4080(S) vs 7900XTX feels like an easy win for the 4080 unless you really need the vram
t. 7800xt user who definitely wouldn't trade it for a 4070 even if I was offered the option
>>
>>102033493
Looks solid to me. The power supply is a bit much, but at that price you might as well buy a 4080. I own a 7900 XT
>>
>>102033548
>radeon
>cpu
You're not smart, are you?
>>
>>102033711
There's a Radeon iGPU in that CPU.
>>
>>102033506
>a legitimate issue like VRAM becomes a problem
This is the ((((((legitimate)))))) issue that Radeontards are talking about in AMD sponsored games like TLoU
>>
>>102033749
He had to scrounge for twenty minutes to come up with an argument. A screenshot of a game that had to be patched because of Nvidia's product line. That's sad.
>>
File: 7.webm (2.73 MB, 512x240)
>>102033749
>not shown in motion: worse 1% fps and dlss smearing on nvidia

AMD has accurate HW T&L
Nvidia doesn't
>>
Buddy you can spam screenshots all you want but you only need just 1 screenshot to convince me to buy the 5070 and that's a screenshot of the 5070 with at least 16gb vram and with a msrp of less than $550
>>
amd fans are so stingy
>>
forgive me if the question is actually retarded
say i want to do middling workloads or 1440p gaming
I know I can suffice with a 4070 or the AMD equivalent, but would getting something beefier like an xx80/xx90 or the AMD equivalent net better temps, since the card wouldn't work as hard and its heatsink is generally sized for the heavier workloads those cards are meant for?
>>
>>102034114
No
99% chance the better card will work harder, produce more heat, and get the work done faster. Temps of the graphics cards themselves will be the same
>>
>>102034114
AMD has L3 cache while Nvidia doesn't. The more L3 cache the more efficient the GPU is. But this is assuming everything is working perfectly when usually its not because Windows is garbage and Linux is perfect.
>>
>>102034114
If you limit the framerate, the card will use less than its max electrical consumption. But you're using a card that by default consumes more in the first place. Both GPUs you'd be considering are the same architecture and process node. The architecture needs to overall spend at least as much electricity on the larger die as compared to the smaller. It's just physics.
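The point above can be turned into a rough back-of-the-envelope estimate, assuming dynamic power scales roughly linearly with how much of the uncapped work the card actually does (a simplification; every figure below is hypothetical):

```python
def capped_power_estimate(full_load_watts: float, idle_watts: float,
                          uncapped_fps: float, capped_fps: float) -> float:
    """Rough estimate: dynamic power scales with the fraction of
    full-speed work actually done (capped_fps / uncapped_fps)."""
    utilization = min(capped_fps / uncapped_fps, 1.0)
    return idle_watts + (full_load_watts - idle_watts) * utilization

# Hypothetical numbers: a 300 W card that could do 180 fps, capped to 120.
print(round(capped_power_estimate(300, 50, 180, 120)))  # 217
```

Real cards have fixed overheads and nonlinear voltage/frequency scaling, so treat this as a lower bound on the savings story, not a measurement.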
>>
>>102034183
Least schizo AMD fan
>>
File: 1701459726351.jpg (48 KB, 562x1389)
>>
File: 1080pGPU.webm (1.87 MB, 1280x720)
>>102031023
Nah, vram is too low for 1440p
>>
>>102034249
Turn off RT and it'll be fine
>>
>>102034249
Wow, you found the one game where RT + framegen is too much for a 4070 12 GB. That's a UE4 game, and fortunately UE5 doesn't have the same problem.

But let's have a laugh at Radeon's performance at those settings
>>
Does having background apps open consume the L3 cache on X3D GPUs enough that it will affect games? Is there a way to monitor its usage or something?
>>
>>102034357
CPUs*
>>
Thinking about switching to a 4k monitor so I can get an oled with not totally garbage text. GPU is 7800XT so probably not going to reasonably run things at 4k. How bad is running at 1440p and upscaling to 4k going to look compared to running native 1440p?
>>
>>102034388
FSR/Xess quality is 1440p on a 4k screen which is supposed to be indisguishable at normal viewing distance.
>>
>>102034413
>fsr
>indistinguishable
AMDelusion
>>
>>102034388
Run 1080p with 4x Integer Scaling.
>>
>>102034428
it's hard to tell the difference unless your eyes are right up to the panel seeing the individual alignment of pixels changing like a burning nerd who needs glasses
>>
>>102034442
>32" 1080p
This thread is full of bright ideas
>>
File: 1705685348928451.gif (1.87 MB, 416x346)
Debating between 4 keyboards.

Wooting Two HE
Akko Anniversary edition HE
DrunkDeer a75
and
SteelSeries Apex Pro

I want a hall effect keyboard for sure after using one I borrowed for a little while, but I also want it to be good for typing and I heard the wooting isn't all that?
>>
Is 64gb ram a waste?
>>
>>102034272
Imagine not being able to use RT. cringe
>>
>>102034618
for gaming yes.
>>
AsRock X670E Taichi Vs. ASUS ROG STRIX X670E-F for a 7800X-3D?
>>
>>102034863
Neither. Just pickup a cheap B650 board and call it day.
>>
In general, what could cause spikes in frametimes? 0.30 msec to 50 msec spikes seemingly when something "new" happens.
>>
The time's here, the time's come. I must build a new computer and I got my sights on a 7800x3d. What kind of RAM should it go with?
I've always been an Intel person, so I have not much of an idea on motherboard models and the like, as well.
>>
>>102034899
Shader compilation
Can't avoid it unless the game has a comprehensive precompilation step
>>
>>102034899
unreal/dx12 shader being compiled
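A shader-compilation hitch shows up as a frame that takes several times the median frametime. A minimal sketch of how you might flag such spikes in a captured frametime log (the threshold values are arbitrary assumptions):

```python
from statistics import median

def find_spikes(frametimes_ms, factor: float = 4.0, floor_ms: float = 8.0):
    """Flag frames whose time exceeds factor x the median (and an
    absolute floor), the usual signature of a compilation hitch."""
    baseline = median(frametimes_ms)
    threshold = max(baseline * factor, floor_ms)
    return [i for i, t in enumerate(frametimes_ms) if t > threshold]

# Mostly-steady ~7 ms frames with one 50 ms hitch:
times = [7.1, 6.9, 7.0, 50.0, 7.2, 7.0]
print(find_spikes(times))  # [3]
```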
>>
>>102034900
DDR5-6000 CL30 is the sweet spot.
>>
>>102034900
Hardware Unboxed recommends the ASRock B650M-HDV/M.2 as the go-to budget board, but honestly the 7800X3D will run well on anything because it uses so little energy.
As for RAM, 6000 CL30 is what's recommended; don't forget to enable EXPO in the BIOS.
>>
>>102034940
>>102034918
Any suggestions for the RAM's brand?
Also, on the Mobo's model, I'm not sure if Asrock is the right option for my GPU, as I'm burdened with a 4090 that might require a better board. Not sure.
>>
>>102031434
neither can 4090/4080 without dlss kek
>>
>>102035047
yeah
but DLSS quality is so good in most situations that people don't mind

Then again, no one should be buying 4K monitors for gaming; even if you own a 4090 it's better to go for a 360Hz 1440p OLED and enable RT
>>
>>102030990
I have waited for 2 years for the 7900XTX to drop to a reasonable price like $800, but it hasn't happened. I don’t care; I’m going to buy the 8800XT when it comes out this year, and that will be the end of it. No more Just Wait®.
>>
>>102034899
If it's not shaders compiling, then it's usually bad programming.

For example, the FFXVI demo will not delete its shader cache if it's corrupted or you change drivers, so the next time you load the game it's going to stutter like hell until you manually delete the cache.
>>
>>102034863
B650 Steel Legend
>>
has anyone ever actually tried to use semen as thermal paste?
I've heard about the joke a million times, but not once have I seen someone try and test it
>>
>>102035093
honeywell phase change pads prob work better so why bother
>>
>>102034988
g.skill is solid
>>
>>102035104
Thank you, Anon.
>>
>2 years worth of waiting for 5% gain
Amd is literally repeating what Intel did during skylake
>>
File: file.png (718 KB, 1197x650)
Any opinions on this brand? It seems to have all the features I want but I'm not at all familiar with the brand. It's just going to be a secondary monitor so I'm not looking for anything amazing.

https://www.amazon.com/SANSUI-FreeSync%E4%B8%A82-1-4%E4%B8%A8IPS%E4%B8%A8Blacklevel-Adjust%E4%B8%A8VESA-ES-G27F2/dp/B0CRHC3QSL/
>>
File: where r2 tree 2.png (1.83 MB, 2251x1045)
>>102034626
>rt
>picrel

>>102034618
with ai, 32gb is skimpy. with blender, 64 might be skimpy, but I'm glad to have it.
>>
Feels a bit expensive, but I wanna invest in a long-lasting gaming PC that I can use for 5+ years.
Or are there cheaper alternatives?
https://pcpartpicker.com/list/GrZhVW
>>
>>102035258
Apparently, to achieve an uplift software will need to be optimized for 9k.
>>
>>102030990
>[Embed] [Embed] [Embed] [Embed] [Embed]
>>
File: file.png (172 KB, 952x1072)
>>102035258
I'm kinda confused, since they upped the node to TSMC 4nm from mixed TSMC 5nm and 6nm nodes.
You'd think they'd get some free perf per watt from that
>>
>>102035350
>All the uplift for zen5 is because of tsmc.
>AMD just copy and pastes zen4, changes the power profile and calls it a day
Yep sounds about right.
>>
>>102035258
A cometlake style 10-core ccx zen4 refresh with 40mb of base L3 (extendable to 120mb) would have been better
>>
>>102034357
>>102034366
If they did then those same tasks would affect other CPUs in the same way, so no.
>>
File: fhn43rh984rt.png (367 KB, 1226x657)
>>102035305
You don't want 1080p stretched across a 27" panel. It'll look like shit.
Instead splurge another $20 for this 1440p IPS model:
https://www.amazon.com/dp/B0CY79PH3C?psc=1
"Only" 100Hz but as this is a secondary monitor, right-sizing the resolution and pixel density is far more important.
>>
>>102035258
QRD on the latest AMD cpus? did they fuck up?
>>
>>102035394
That's too much work for AMD.
>>
>>102035446
Gains are reliant on a W11 update that hasn't been rolled out yet. Once it's out, expect a ~11% uplift across zen 3-5 platforms.
>>
>>102035446
there's some benchmark weirdness going on with the power profile of the chips, and branch prediction that's impacted by the OS
>>
>>102035432
I already have a 27" 1080p as my primary and I want to keep it that way for vidya performance.
>>
>>102035258
Way worse than Intel. Even their worst gen-on-gen uplift was about 8%, and you need to remember Intel releases a new CPU every year while AMD does it every two years
>>
>>102035446
Spent 2 years doing nuffin
>>
>>102035446
marketing promised higher increases than reviewers found, achieved by doing the following:
use Windows dev branch (not available to general users)
admin account (full privileges to everything)
running competitor at baseline profile, ddr5-6000
>>
>>102031030
any amd or intel gpu is miles better than any nvidia for linux desktop usage

>t. linux fag
>>
File: 1520638933823.jpg (83 KB, 635x355)
>>102035481
>but you need to remember Intel releases a new CPU every year while AMD does it every two years
This is an important point. For all the (rightfully deserved) shit Intel got in those days for switching sockets or chipsets every gen, you could count on at least a 15-20% total compounded performance improvement every two years thanks to their annual tick-tock cadence: die shrink --> new arch --> die shrink --> new arch, rinse and repeat

AMD is slow to deliver and came empty handed this time.
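Back of the envelope, using the thread's own rough figures (~8% per Intel gen, ~5% for this AMD gen; illustrative numbers, not benchmark data):

```python
# Two compounded yearly uplifts vs one biennial bump over the same window.
intel_two_years = 1.08 * 1.08   # two ~8% gens back to back
amd_two_years = 1.05            # one ~5% gen every two years

print(round((intel_two_years - 1) * 100, 1))  # ~16.6% over two years
print(round((amd_two_years - 1) * 100, 1))    # 5.0% over two years
```

Compounding is why a faster cadence wins even with modest per-gen gains.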
>>
>>102031030
Most people are running fucked-up monitor timings or calibration settings, which affects their overall experience: VRR, colors, brightness, etc.
>>
>>102035455
what if i don't want to run w11
>>
>>102035446
>>102035481
it's partially a scheduling issue, as seen in the uplift on Linux, but I think the anon who surmised that they tried to get the IF to work reliably at 2400MHz and failed hit the nail on the head, because it really seems like they beefed up the architecture significantly but can't get it to breathe while the IF is stuck at 2000MHz.
the 9800X3D is probably going to be a monster, not least because they've moved the vcache underneath the CCD so it's no longer going to get roasted and can run at regular clocks, but also because they made the bottleneck that saw the huge gains in the 7800X3D even worse.
>>
9800x3d WHEN
>>
>>102035747
next year
>>
>>102035747
march? april?
>>
>>102031030
Fans hardly matter for noise or lifetime if they are slowed down.
The biggest differences come with full speed operation but most fans are noisy regardless

Airflow and static pressure are a different discussion, but there is only so much that can be squeezed out of an axial design and diminishing returns hit pretty soon.
>>
>>102035350
N5, N4, and their variants are not particularly different.
>>
have the sinkclose microcode updates been released yet?
>>
>>102035704
You say all of this but the performance gain on the other models is shit.
>>
>>102035697
We dunno if they'll release an update for 10, they probably will later.
>>
Ryzen 9000 is great for people with actual jobs, gamers are just mad they weren't catered to.
AMD literally told them to wait for X3D instead or just upgrade their GPU (reminder: gaming is more GPU intensive) but they didn't and instead whined as loud as possible like the niggerfaggots they are
>>
>>102035889
if base 9000 is underwhelming the 3D models will just be underwhelmingX-3D no?
>>
>>102035926
9000x3d will get a 40% uplift compared to 7000x3d
>>
>>102035964
60% even
>>
>>102035926
Nah, this guy has a good bead on things >>102035704. The cache is more performant on Zen 5, it's just that infinity fabric is too slow. Making the cache bigger might actually show some gains for once.
>>
File: 1721956361175268.jpg (123 KB, 1281x1395)
are any 4:3 or 16:10 (or similar aspect ratio) monitors on sale that DON'T have some horrific dealbreaking downside?
budget notwithstanding
>>
>>102036000
>jj-just trust me bro. x3d version wont be a flop
>>
inb4 the supposed windows update does no uplift
>>
>>102036000
i wouldn't expect a miracle
>>
>>102036030
It's normal 4chan etiquette to post a frog/wojak when you have nothing to say and greentext shitpost.
>>
>>102036058
>frog
>>
So when should we expect Intlel to make a good GPU? I want lots of VRAM for cheap while also being able to play stuff decently.
>>
>>102036047
the update will make amd run all games in kernel mode with the down side being more vulnerabilities
>>
>>102035889
>AMD meant for the CPUs to be shit
>they are purposely made unappealing to gaymers
What kind of cope is this?
>>
>>102035889
AMDelusional
>>
>>102034388
Looking at it more, it seems best for me to stay at 1440p, but I still want to try an OLED. How much better is the text clarity on gen 3 QD-OLEDs? I could not stand the fringing on the gen 1 panel I looked at.

For woled the XG27AQDMG seems like the best option currently. The upcoming PG27AQDP uses the new subpixel layout and should actually have good text but $1000 for 27" 1440p is real steep.
>>
how are tech reviewers taking this news about the recent ryzen? they usually fellate and shill AMD whenever possible so are they downplaying this?
>>
>>102036160
Why do you come here to act like a gossip? gb2/v/
>>
>>102036160
if anything they've been a bit shitty about it
>ONLY 5%?
>>
>>102036160
amdunbox losing it
drama nexus in disbelief
linus cuck tips giving blow jobs
>>
>>102036082
they made youtubers run their machines signed in as administrator as a fix...
>>
>>102036186
Based amd singlehandedly destroying the psyche of techtubers within 2 weeks
>>
>>102036188
(which btw fixed nothing because the "uplift" was the same across all CPUs)
>>
>>102036082
>down side being more vulnerabilities
durgahole exploit when
>>
File: hq720.jpg (44 KB, 686x386)
>>102036229
>>
>>102036174
cause it's fun and i want to talk shit
>>
File: es.png (694 KB, 1356x1281)
>>102030990
got me that cheap $110 7600 ES. I think I did good bros
>>
>>102036072
It's kind of hard to tell from following their Linux drivers. Intel is throwing out the core of their driver (called i915) and replacing it with a new driver (called Xe, which makes googling difficult because "Xe driver" gives you articles from two years ago about Intel GPU drivers being bad). The new Xe driver is intended primarily for the new Battlemage cards, and apparently Battlemage won't even work with the old i915 driver; you have to use Xe. Xe doesn't work that well with the current Alchemist cards, and it seems Intel plans on never fully getting Alchemist to work with the new driver.

As much as possible, GPU manufacturers don't make separate drivers for Windows vs Linux; under the hood they mostly function the same. So Intel is surely doing the same thing with its Windows drivers, it's just not talking about it publicly like it does on Linux. Intel has said the Xe driver keeps more GPU processing INSIDE the GPU for Battlemage; for Alchemist, one of the main driver problems was that the GPU kept nagging the CPU to help it with calculations. This is one of the reasons Alchemist runs way worse than you'd expect given the GPU die space and supporting hardware like the RAM.

The first Battlemage product will be Intel's Lunar Lake, though only an iGPU, we'll have to see how it and its drivers perform when that comes out.
>>
>>102036280
your motherboard cost more than your cpu
>>
>>102036160
>directionbrain
They call good good and shit shit. This is shit, they call it shit.
>>
>>102036325
it's around the same. The B650M PG Lightning is $110 on amazon right now: three NVMe slots, and a 4th can be had using the second PCIe slot.
The VRM is a bit anemic, it's a 3x2 on Vcore, but I only plan to use single-CCD chips anyway ;)
>>
>>102035065
>Then again no one should be buying 4k monitors for gaming
Always better to run the highest resolution possible with DLSS. Costs the same as lower resolutions and looks better
>>
>>102036344
This retard got shamed out of using TAA and thinks we're not going to notice DLAA, as though that isn't worse.
>>
>>102036216
actually, it only has an uplift on Zen3, Zen4 and Zen5; I don't think Intel CPUs are affected at all. Besides, that bug is only like a 2% difference vs the 10% uplift from Zen4 to Zen5 in Linux.
The Windows scheduler is fucked, but it's only a tiny part of the problem with Zen5.
>>
>>102035889
this post mindbroke the intelnigger
>>
File: maxresdefault.jpg (126 KB, 1280x720)
is this just a big chungus general? more itx please
>>
now gonna get me that 24H2 thing, built an iso via uupdump
>>
File: 1720292481131774.gif (1.22 MB, 320x288)
>>102033946
>5070 with at least 16gb vram and with a msrp of less than $550
>>
File: 1591966136005.gif (78 KB, 480x269)
>>102036388
>itx case
>18.25 Liters
you can get a matx case that's smaller than that overrated garbage
>>
>>102036438
could they fit a 4090? thought not
>>
Any good for the 7800x3d?
"ASRock X670E PG Lightning"
Will probably switch to a 9800x3d when it's released. Please advise.
>>
>>102036497
its fine.
>>
any real downsides of the MSI PRO B550-VC over the MSI PRO B550M-VC?

no wifi or bluetooth shit and is cheaper, seems like a better option to me
>>
File: 1688878863883529.webm (3.76 MB, 1080x1920)
>>102036367
Really is opposite day for retards, isn't it? The fact is I constantly shame retarded Radeonfriends like you for calling TAA native (and guess what, TAA is better than native, aka no AA). Even DLSS is usually better than or equivalent to TAA >>102032094, to say nothing of how good DLAA is, which is the pinnacle of AA methods at the moment.

FSR of course is below even TAA
>>
>>102036497
Asrock/gigabyte are safe picks
>>
File: winver.png (19 KB, 567x324)
>>102036392
it's pretty nice
>>
>>102036513
>and guess what, TAA is better than native, aka no AA
No it isn't. TAA isn't AA, btw. Aliasing is when a signal is displayed incorrectly not because of a bug or error, but because of a limitation of the technology. Anti-aliasing is when incorrect information is deleted from the image and correct information is inserted, the limitation of real AA being that it might not delete all the incorrect information.

You are posting bullshit like TAA, DLAA, and DLSS, NONE of which are anti-aliasing technologies. They all guess what is and isn't an alias and then replace it with whatever they think is correct, meaning they not only delete incorrect information, they also delete correct information and then insert incorrect information. That's why both sides of >>102036344 look bad. MSAA and SSAA are the real AA technologies, because they only improve the image and not one pixel they change was an incorrect change.

I have no idea how you haven't been range banned yet, all you do is shitpost, you're immediately visible and all you do is push consistent lies because of your shitty brand loyalty.
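The SSAA idea (render high, average down) can be sketched on a toy grayscale grid; the 2x factor is just an example, and real implementations do this on the GPU:

```python
# Toy SSAA: render at 2x resolution per axis, then average each
# 2x2 block into one output pixel (grayscale values 0-255).
def downsample_2x(img):
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = [img[y][x], img[y][x + 1],
                     img[y + 1][x], img[y + 1][x + 1]]
            row.append(sum(block) / 4)  # every output comes from real samples
        out.append(row)
    return out

# A jagged black/white edge "rendered" at 2x...
hi_res = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
# ...downsamples to a smoothed edge: [[0.0, 255.0], [127.5, 255.0]]
print(downsample_2x(hi_res))
```

Every output pixel is an average of pixels that were actually rendered, which is the point being argued above: nothing is guessed or inserted.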
>>
>>102036465
NR200
>ITX mobo
>18.25L
>334x160mm

Mechanic Master C28
>Matx mobo
>17.9L
>335x162mm GPU

I win the argument.
>>
>>102035388
If it was just a copy and paste of Zen4, then it wouldn't have such terrible cross-CCD latencies.
>>
>>102036628
>latency worse than intel p-e cores
amdover
>>
File: input vs output2.jpg (426 KB, 2007x1202)
>>102036598
>TAA isn't AA
>temporal anti aliasing isn't anti aliasing
You're retarded, enough said
>>
>>102036649
>temporal anti aliasing isn't anti aliasing
>posts a picture showing TAA creating aliasing
The offices behind the glass and the Fuyutsuki sign on top are entirely smeared out of existence. It doesn't do that if you turn TAA off. There are similar errors all over that screenshot created by TAA, like the deleted white parking lines and all of the text on that circular sign being turned into a blob.
>>
>>102036598
as much as I dislike the rabid nvidia fanboy over there, we can't live without TAA anymore in modern engines
I honestly run FSR only because it removes the smear. And it looks better than native that way, it's stupid. Thankfully they started adding FSR native options now, aka the same thing nvidia markets as DLAA. It's all the same thing.

One thing I don't know yet is whether 4k helps with TAA. In theory it shouldn't be as blurry.
>>
>>102036726
>It doesn't do that if you turn TAA off
yes, but it creates all sorts of problems in other areas, way worse problems, I tried running 2077 without it, its clean sure, but any reflection or shiny surface is terrible
>>
>>102036726
You really are delusional
https://youtu.be/WG8w9Yg5B3g?si=YbVjOcxjhUlMhzRS&t=1046
I'm sure you play all your games native. The fact is that you're too stupid to know you're looking at TAA most of the time
>>
>>102036740
>as much as I dislike rabid nvidia fanboy over there, we can't live without TAA anymore in modern engines
TAA looks worse than no AA. It smudges everything just like FXAA because it's literally the same thing, except with a temporal gatekeeper that's supposed to prevent FXAA's most egregious errors. It mostly does not; the frame still mostly looks like FXAA. On top of that, it's temporal, meaning it creates ghosting everywhere, which half the time you have to turn off in the game's .ini file or something.
>>
>>102036740
>It's all same thing.
This is what retarded poorfags think. FSR Native is the same ugly reconstruction as any of the other FSR modes
>>
>>102036762
>The fact is that you're too stupid to know you're looking at TAA most of the time
No I'm not, you're an idiot who insists we play games with awful visual quality like you do because of your brand loyalty.

The christmas lights in your link look wrong, btw. For instance the street banner is supposed to blink on and then blink off, with TAA enabled it fades in and out incorrectly. It also for sure looks wrong in motion, you can see Spiderman and the pedestrians are blurred on the edges when they move, there's no chance the trees and lights look correct when Spiderman or the camera moves. Your example video is avoiding that.
>>
File: E0Xirb1VcAIJU1O.png (246 KB, 531x395)
MSI Cyborg 15 A12V for $800 new, is it worth it? My other options are an Acer Nitro 5 and a Lenovo LOQ, but those come with the 12500H and 12450H, which have 4 P-cores, while the Cyborg comes with a 12650H with 6 P-cores. I will be using it for audio production, mainly Pro Tools, and some casual gaming.
>>
File: 1715958319253325.png (1.33 MB, 1600x900)
>FSR slower than Xess on amd hardware
does anyone care about software at amd?
>>
>>102036766
all true, but at the same time it's an integral part of 80% of the effects devs use
for example if you disable TAA in RDR2, trees look naked in the distance; again it's stupid but that's what it is
I have ways to fight it on the AMD side: image sharpening in the driver and FSR at 100% resolution in game. RSR also helps if FSR isn't supported; drop the in-game resolution 5% and it activates, still cleaner than native.

>>102036780
why did you post balanced comparisons then eh?
>>
>>102036841
have you even tried xess? it ghosts like crazy
>>
File: please-respond.jpg (61 KB, 500x500)
61 KB
61 KB JPG
>>102036512
>>
File: es.png (777 KB, 2089x1074)
>>102036547
heh, looks good to me ye
>>
>>102036827
Don't you have an overnight shift at McDonald's to drag yourself to? It's time for my beauty sleep

>The in-game TAA solution has very poor rendering of small object detail—thin steel objects and power lines, tree leaves, and vegetation in general. The in-game TAA solution also has shimmering issues on the whole image, even when standing still, and it is especially visible at lower resolutions such as 1080p, for example. All of these issues with the in-game TAA solution are resolved when DLAA, DLSS or XeSS are enabled, due to the better quality of their built-in anti-aliasing solution. Also, the sharpening filters in the DLAA, DLSS and XeSS render path can help to improve overall image quality. With DLSS and XeSS you can expect an improved level of detail rendered in vegetation and tree leaves in comparison to the in-game TAA solution. Small details in the distance, such as wires or thin steel objects, are rendered more correctly and completely in all Quality modes. With DLAA enabled, the overall image quality improvement goes even higher, rendering additional details, such as higher fidelity hair for example, compared to the in-game TAA solution, DLSS and XeSS. Also, both DLSS 3.1 and XeSS 1.1 handle ghosting quite well, even at extreme angles.
https://www.techpowerup.com/review/cyberpunk-2077-xess-1-1-vs-fsr-2-1-vs-dlss-3-comparison/

>>102036866
>why did you post balanced
Because no one bothers even looking at the garbage that is FSR Native

>>102036877
Depends on the game; usually it's better than FSR.
https://www.youtube.com/watch?v=giBaJIOyIsI
>>
>>102036512
The former is ATX sized which is Gucci
The latter is mATX which sucks for a DIY build
>>
>>102036901 (me)
I just want to highlight
>All of these issues with the in-game TAA solution are resolved when DLAA, DLSS or XeSS are enabled, due to the better quality of their built-in anti-aliasing solution
again for the retard who said TAA, DLSS, and DLAA aren't AA

And with that, I bid you goodnight. Radeonfriends, sleep tight. Don't let the bed bugs bite
>>
>>102036901
>Don't you have an overnight shift a McDonald's to drag yourself to? It's time for my beauty sleep
You aren't fooling anyone; this is the kind of projection you can tell really applies to you. You're habitually posting links and screenshots of bullshit lies for a brand that only you care about. You're a loser. Nice greentext, as usual it does fuck all to show what real image quality looks like, and you just posted it hoping we're all going to be scared to challenge you on the basis that our eyeballs can't see the aliasing you so love.
>>
>>102036840
pls respond
>>
File: snapshot_07.32.260.png (2.49 MB, 1920x1080)
>>102030990
ryuuko is a good fap
>>
anyone wanna help an oldfag make something new? My last build was back when the i7 4770k was hot shit lmao -- hoping to make this new machine last a similar time frame

https://pcpartpicker.com/list/NHnYMV
>>
>>102036995
Why are you buying a mobo with wifi and also a wifi card? You need two wifi adapters?

DDR5 6000 CL30 is best RAM, btw, not CL36.
>>
>>102036956
>MSI Cyborg 15 A12V
I would get something with a iGPU with better CPU cores if you want music pro tools
dedicated GPU in a laptop is a waste.
>>
>>102036513
>native, aka no AA
Native refers to resolution, as in the game is rendering at the resolution of the screen it is displaying on. It has nothing to do with what AA is or is not in use, you can render at native resolution and use all kinds of AA or none at all. DLAA for instance renders at native resolution and then applies AA too.

Also native with no AA on a decently high-DPI screen (4k 27" for instance) looks better than TAA. TAA is just complete trash, it's almost never worth using unless the game has a super shit renderer which breaks when TAA is off.
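The pixel-density claim is just geometry; a quick sketch of the numbers:

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 27)))  # 4k 27": ~163 PPI
print(round(ppi(2560, 1440, 27)))  # 1440p 27": ~109 PPI
print(round(ppi(1920, 1080, 27)))  # 1080p 27": ~82 PPI
```

At ~163 PPI individual pixels are hard to resolve at desk distance, which is why aliasing bothers people far less there than on a ~82 PPI panel.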
>>
>>102037031
>it's almost never worth using unless the game has a super shit renderer which breaks when TAA is off.
99% of games past 2019
>>
>>102036995
Do you live near a MicroCenter?
>>
>>102035624
This but Linux is also better at games.
(Cyberpunk has been about 15% faster on Linux for months now but nobody noticed, kek)
>>
>>102037046
>Cyberpunk has been about 15% faster on Linux for months now
what about 150 other games? The question is consistency across the board; it's like intel GPUs, they only work well for a select few games
>>
>>102037056
Intel GPUs "work" with basically every game. In fact most games that don't work on Arc don't work for reasons entirely unrelated to the card or its drivers. Like that Batman game from a decade ago that just checks the GPU manufacturer and refuses to even launch if it says Intel. You have to hex edit the game's .exe to correct the issue.

Likewise, Linux gaming works fine unless it's something like kernel anti-cheat or a game that is both badly programmed and obscure (like Hearts of Iron 3, in my case).
>>
>>102037036
99% of the games you play I guess, not me. The only one I remember having that problem was RDR2 many years ago.
>>
The plastic on my AMD wraith stealth cooler looks like it has AIDS. Are there knockoffs of these?
>>
>>102037089
fallen order has terrible taa, disabling it makes the game look weird, not clean
>>
>>102037026
6000 28-36-36-96 is the new best ram for 'zen
>>
File: xdgdg.jpg (6 KB, 250x191)
6 KB
6 KB JPG
>>102037027
The fastest processor I can find for my budget is the 12700H, and the 12650H is just about as fast because audio production is mostly single-core performance. That's why the Cyborg 15 is my option: it has a better processor even if its 4060 is a low-TDP one.
Am I wrong in believing this?
>>
>>102037026
>two wifi
cause I smoke too much weed, and I've looked at too many pc parts recently to keep it all straight lol

What's the diff on CL30 vs CL36?

>>102037037
no

>>102037144
why is that?
>>
>>102036616
nice case but why be mean to my bro nr200, he's sensitive about his size T_T
>>
>>102037172
I don't know, look at specific benchmarks for your tools; if you want to run it on battery I'd scrap the dedicated GPU idea.
Look at APUs, I doubt an APU laptop with fresher cores is more expensive than the dGPU one.
>>
>>102037207
>Whats the diff on CL30 vs CL36?
When the CPU makes a call to the RAM asking it to do something, the RAM will not respond immediately. CAS latency is a measure of how long it takes for the RAM to begin responding to the CPU's request. CL30 means it takes 30 clock cycles before the RAM finally starts working.

Imagine you called someone on the phone and want information from them ASAP. RAM speed is how fast the conversation can get done after the other person has picked up. Latency is how long it takes for the other person to pick up the phone in the first place after you've dialed. If you have to make hundreds of phone calls, that delay matters.
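The phone analogy in numbers (just the CAS term; a real access also pays tRCD and friends):

```python
# Time-to-first-word from CAS latency: DDR transfers twice per clock,
# so a "DDR5-6000" kit runs a 3000 MHz I/O clock.
def cas_latency_ns(mt_s, cl):
    clock_mhz = mt_s / 2
    return cl / clock_mhz * 1000  # nanoseconds

print(cas_latency_ns(6000, 30))  # DDR5-6000 CL30 -> 10.0 ns
print(cas_latency_ns(6000, 36))  # DDR5-6000 CL36 -> 12.0 ns
print(cas_latency_ns(3200, 16))  # DDR4-3200 CL16 -> 10.0 ns, same wait
```

Note the CL number alone means nothing without the transfer rate: DDR4-3200 CL16 and DDR5-6000 CL30 wait the same 10 ns, but the DDR5 kit moves far more data once the call connects.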
>>
>>102037280
Checks out, really appreciate the explanation.


Is the build lookin half way decent otherwise?
>>
>>102037321
Looks fine.
>>
File: 1712123859508405.gif (2.35 MB, 272x480)
please forgive my amateur question im very unfamiliar with building computers:
i have pre built pc with a geforce rtx 2060 that is currently dying, and i'd like to upgrade. is there anything that i need to watch out for when buying a new gpu specifically with regard to compatibility with my current motherboard/cpu? or can i buy any regular old rtx 4070?
>>
When will amd give us more cores
>>
>>102037380
First, the new GPU needs to actually fit in the computer. Newer cards are longer and bulkier; the listing will give the card's measurements. You'll need to open the computer and compare them to the available space where your current card sits.

Secondly, your power supply needs the right power connectors for the new GPU, and its 12V rail needs to be powerful enough to fully power it. Check which PSU model you have; you can post it here and we can check.
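A hypothetical pre-purchase sanity check, using the OP's ~75% max PSU load rule of thumb; the example numbers are placeholders, read the real ones off the product pages:

```python
# Hypothetical drop-in GPU upgrade check. Card/case/PSU numbers are
# illustrative, not specs for any particular product.
def upgrade_ok(card_len_mm, case_clearance_mm, card_watts, psu_watts,
               rest_of_system_watts=150):
    fits = card_len_mm <= case_clearance_mm
    # keep total draw under ~75% of the PSU's rating (OP guidance)
    powered = card_watts + rest_of_system_watts <= 0.75 * psu_watts
    return fits and powered

print(upgrade_ok(244, 300, 200, 650))  # a ~244mm card on a 650W unit: True
print(upgrade_ok(340, 300, 450, 650))  # too long and too hungry: False
```

The 150W "rest of system" figure is a rough allowance for CPU, drives, and fans; bump it for a high-end CPU.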
>>
>>102037403
>what the fuck is threadripper
>>
>>102037424
thank you
>you can post it here and we can check.
i will do that tomorrow, or maybe never
>>
>>102037441
I think he means at the same price point as yesteryear's models.

>>102037403
They can't without TSMC's N3, and even then a 16 core CCD is probably some time into the future. I don't think anyone wants Zen 5c 16 core CCDs.
>>
>>102037441
is it too much to ask for ryzen 5 to bump up to 8 cores instead
>>
>>102037474
>>102037474
>>
>>102037468
>They can't without TSMC's N3
they absolutely can, N4 Zen4C CCDs are 12 core.
>I don't think anyone wants Zen 5c 16 core CCDs.
enterprise wants as many cores as AMD can give them, I assume Zen5C will be N3
>>
>>102034215
literally me XD
>>
>>102033039
Only put in what's necessary. There are many safeguards built into hardware to prevent thermal damage and fire. Stay away from bleeding-edge, first-revision hardware and you should avoid housefires. The biggest fans you can fit are generally the best option. Most cases work best with intake at the bottom and front, and exhaust at the back and top. If you max out the number and size of fans you'll probably be able to get your setup as quiet as possible. More fan-blade surface area allows slower speeds, which means less noise.

There are too many other factors (case size, GPU, etc.) for other anons to make that determination. You'll have to decide yourself or detail your setup very specifically.
>>
File: 1672916050069648.png (200 KB, 361x363)
200 KB
200 KB PNG
>>102034215
i stopped waiting today
>>
Out of these two options which one is less awful for the same price? My budget can't get any higher than that.
https://pcpartpicker.com/list/7GCGrv
https://pcpartpicker.com/list/RCR2h3
>>
>>102038226
Both have heavy compromises.
The AM4 board has an awful VRM unlikely to handle an 8-core, and your AM5 kit has 8GB DIMMs, which are slower than 16GB sticks due to DDR5 peculiarities
>>
>>102038247
And if I switch the AM4 board for a better one?
The idea is - cheap AM4 vs absolute poorfaggotry AM5. What makes more sense at the moment?
>>
>>102038294
am5 with 2x16 memory kit is the best option
>>
>>102038315
I see, but then it goes way over the budget. That's why I don't know whether it's worth going AM5 with these kinds of compromises or just building more comfortably on AM4 with more RAM or a better motherboard. Probably the latter if 8GB DDR5 sticks are such a bad idea.
>>
>>102038522
buy used parts or order the 7500f from aliexpress?
>>
>>102038574
>7500f from Aliexpress
Good suggestion, but that's already the plan, same for 5700X, otherwise I'd have to go with 5500 or 5600.

Maybe I can find something used tho. Currency is inflating in my shithole, so I think I shouldn't wait with the upgrade.
>>
>>102038699
I use AM4 myself and don't feel the need to upgrade
>>
File: 1720384487274489.jpg (1.49 MB, 1600x1200)
1.49 MB
1.49 MB JPG
building a computer is a waste, just get an oled steam deck
trust me bros
>>
>>102038797
Then I'll just build on AM4 if I can't find anything cheaper and figure something out in a few years. Not that I really need constant upgrades - my current stuff is from 2012 and it still sort of runs vidya.


