what would push a 4070ti harder
1080p@120fps or 1440p@60fps
also do you personally prioritize resolution or framerate?
resolution pushes the GPU more, the framerate is just the result.
The goal is to get bottlenecked by the GPU, so you won't be held back by your CPU, which causes unplayable stutters.
that said, with unreal engine you'll be stuttering regardless and your resolution won't matter, it's gonna look like smeared shit either way.
higher framerate pushes your card more than resolution
don't listen to the retarded above me
higher pixels pushes your card more than framerate
don't listen to the retarded above me
>>719827036
>>719827078
>>719827201
>>719826719
This is exactly the kind of shit AI was made for.
>The 1440p at 60 FPS scenario is likely to push the Nvidia GeForce RTX 4070Ti harder than 1080p at 120 FPS in most modern games, particularly those with high graphical fidelity (e.g., ultra settings, ray tracing). The increased resolution at 1440p demands more from the GPU’s memory bandwidth, texture processing, and shader performance, outweighing the frame rate demands of 1080p at 120 FPS. However, the specific game, graphical settings, and CPU performance can influence this outcome. For a definitive assessment, testing both scenarios in the target game with identical settings is recommended.
>>719828993
so basically using more memory works the card harder than simply rendering frames faster
>>719826719
>>719827372
Toxotransmosis moment
>>719826719
1440p is 3.7 megapixels
1080p is 2.1 megapixels
Therefore 1440p at 60 FPS is a bit less demanding than 1080p at 120 FPS
If you have a 1440p monitor you should always play at 1440p, as 1080p doesn't scale to 1440p by an integer factor and will look like blurry dogshit. Shoulda sprung for a 4K; that way, if you need to play 1080p, you can without issue.
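Rough Python scratch pad for the numbers above, if anyone wants to check (this is just the arithmetic, it says nothing about actual GPU load, and the resolution list is only an example):

# megapixels per frame, and how 1080p maps onto bigger panels
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels")

# 1080p on a 1440p panel needs a non-integer scale factor (hence the blur);
# on a 2160p panel it's exactly 2x
for panel in ("1440p", "2160p"):
    factor = resolutions[panel][1] / resolutions["1080p"][1]
    kind = "integer" if factor == int(factor) else "non-integer"
    print(f"1080p on a {panel} panel: {factor:.3f}x ({kind} scale)")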
i'll stick to 60 because most asian games run like shit or can't even run above 60 and old games are stuck at 60
>>719830274
This is a great example of how AI is fucking retarded and just makes shit up. You gain about 50% more frames at most by switching from 1440p to 1080p.
https://pc-builds.com/fps-calculator/result/1cl1fl04C/core-i7-12700k/geforce-rtx-4070/elden-ring/3840x2160/
https://pc-builds.com/fps-calculator/result/1cl1fl03G/core-i7-12700k/geforce-rtx-4070/red-dead-redemption-2/3840x2160/
https://pc-builds.com/fps-calculator/result/1cl1fl0km/core-i7-12700k/geforce-rtx-4070/monster-hunter-wilds/3840x2160/
>>719830274
Thank you for the correction and for providing sources. I appreciate the clarification. It's important to ensure accurate information is shared, and your input helps with that. I’ve updated my understanding accordingly.
>>719826719
720p 60fps is enough
i use a custom 480x360 1000hz grayscale monitor (for pro gaming)
it's only good if you have a deep understanding of the game you're playing because you can't see shit
>>719830274
The question was how hard the card is being pushed, not what framerates to expect. What the fuck are you even talking about? Way to get mogged by Grok, faggot.
>>719832097
How hard the game is getting pushed determines the frame rate, you fucking idiot. If you can run a game at 1440p 60 FPS but dropping to 1080p doesn't give you 120 FPS, that means that 1440/60 is less demanding than 1080/120, which is entirely predictable from the pixel throughput alone.
>>719832259
How hard the card is getting pushed, rather
>>719831875
What the fuck are you? A weasel?
>>719826719
I play everything at 1080p. Currently use a 3080 and a 4070 setup. I could use my 4k screens but I figure, why push a graphics card harder than I need to? I don't really notice a difference between 1080 and 1440 anyway. And just to be clear, I'm aware that there is a difference, it just personally makes no difference to me.
>>719826719
Don't think about hur dur whats le optimal numberz specs omgggggg
Just ask yourself
>do I like the 24" screen more
>or the 27" screen
>>719826719
1440p60, would never go back to 1080p after experiencing 1440p. But a 4070 ti can push 1440p 120 fps no problem.
>>719832259
assuming you're not cpu bottlenecked at that point
idk. it's not always linear and depends very much on post processing and how it's done in that particular game.
for me, native resolution (currently 1440) is non negotiable, while at least 120+ fps is ideal but i can deal with fluctuations, within reason, depending on what's happening.
>>719834860
CPU bottlenecking actually decreases the relative cost of higher resolution rather than increasing it.
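A toy frame-time model in Python (made-up numbers, purely illustrative) shows why: in a simple serial model the frame time is whichever of the CPU or GPU finishes last, so a slow CPU swallows most of the cost of the extra pixels.

# hypothetical per-frame costs in milliseconds
cpu_scenarios = {"fast CPU": 2.0, "slow CPU": 8.0}
gpu_ms = {"1080p": 5.0, "1440p": 9.0}

for label, cpu_ms in cpu_scenarios.items():
    fps = {res: 1000 / max(cpu_ms, g) for res, g in gpu_ms.items()}
    drop = 1 - fps["1440p"] / fps["1080p"]
    print(f"{label}: 1080p {fps['1080p']:.0f} fps, 1440p {fps['1440p']:.0f} fps ({drop:.0%} drop)")
# fast CPU: 44% drop going to 1440p; slow CPU: only an 11% drop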
>>719826719
1920*1080*120 = 248,832,000 pixels per second
2560*1440*60 = 221,184,000 pixels per second
So 1080p/120fps is 12.5% more pixels per second than 1440p/60fps. However, I don't know if this is actually how it works. There could be other factors.
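Same math as a quick Python sanity check (raw pixel throughput only; a sketch that ignores shader cost, memory bandwidth, CPU limits, and everything else):

def pixels_per_second(width, height, fps):
    # raw pixels drawn per second, nothing more
    return width * height * fps

p1080_120 = pixels_per_second(1920, 1080, 120)  # 248,832,000
p1440_60 = pixels_per_second(2560, 1440, 60)    # 221,184,000
print(f"1080p/120fps pushes {p1080_120 / p1440_60 - 1:.1%} more raw pixels per second")  # ~12.5%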