/v/ - Video Games
File: qual.png (496 KB, 580x433)
Would you rather...

>Play 1080p on High Settings
>Play 1440p on Medium Settings

Convince me that upgrading to a 27 inch 1440p monitor is worth it. It just seems like you are getting worse performance. Also, I'm sure 720p/1080p videos and streaming are going to look worse.
>>
>>729064179
I fucked up and upgraded to 1440p. I can run games fine but it really cut into my average FPS, so I definitely wouldn't say it was worth it.
>>
>>729064476
Yeah they say it's a 20-30% fps hit. I don't know if lowering settings can make up for it.

How about videos/streaming? Is it tiny or pixelated since it's usually 1080?
>>
>>729064179
With modern games (TAA), a higher res does more for visual quality than settings do. Most games today also leverage upscaling, and 4k DLSS looks better than native 1440p and a hell of a lot better than native 1080p, with higher fps than native 1440p. At this point, if you play modern games, you are better off with a 4k monitor.
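
If you want the raw numbers behind that, here's a back-of-envelope sketch (Python; the DLSS per-axis scale factor is the commonly cited one, treat it as an assumption):

# Pixel counts per resolution, and what "4k DLSS" actually renders internally.
RES = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}

def pixels(res):
    w, h = RES[res]
    return w * h

for name in RES:
    print(f"{name}: {pixels(name):,} px ({pixels(name) / pixels('1080p'):.2f}x 1080p)")

# DLSS Performance mode renders at ~50% per axis, i.e. a 1080p-sized internal
# image reconstructed to 4k output, which is why the fps can beat native 1440p.
internal_w, internal_h = int(3840 * 0.5), int(2160 * 0.5)
print(f"4k DLSS Performance internal: {internal_w}x{internal_h}")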
>>
>>729064756
Just use DLDSR for games you can run at 4k.
4k is a meme, I own a 4090, the second best GPU on the market, and it's not strong enough for comfy 4k.
DLDSR is very close to the 4k output when you can afford it fps-wise, though. I know that because I was using DLDSR 1440p on my old 1080p rig and noticed very little difference moving to real 1440p.
>>
>>729064740
>How about videos/streaming? Is it tiny or pixelated since it's usually 1080?
No, that's not an issue for me.
>>
>>729064179
>can you even tell?
not with some retard's dumb fucking fat and color gradient in the way, faggot
if you're going to repost shit for (You)s, can't you find better content at the very least?
>>
>>729064756
>>729064825
Are you talking console or PC?

4k is definitely not going to be fun without xx90 card on PC
>>
File: sashimi.jpg (161 KB, 1280x720)
>>729065112
Relax bro it's just a random google image vaguely related to the topic

Will this appease your tism
>>
File: 1749551860147086.jpg (118 KB, 567x577)
>>729064179
1440p on low looks far better than 1080p on high
>>
>>729065281
Depends on PPI
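
In numbers (standard diagonal-to-PPI formula, rough sketch):

import math

# Pixel density from resolution and panel diagonal: PPI = diagonal_px / diagonal_in.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')
print(f'27" 4k:    {ppi(3840, 2160, 27):.0f} PPI')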
>>
>>729065315
even if you have small pp it still looks better
>>
>>729065327
Not in my experience
>>
>>729064179
I just press x on the icon on my ps5 homescreen and start playing
>>
I'd rather use a real resolution like 4k, instead of a 20 year old res like 1080p, or a 15 year old res like 1440p.
>>
>>729064179
What's a good graphics card for 1440p?
>>
>>729064179
4K on low settings
>>
>>729065621
rtx4070/5060 or the amd equivalent, basically lower end of mid range
>>
>>729065598
>4k

are you people trolling? do you play in 30 fps?
>>
>>729064476
that's why you get a new GPU when you upgrade to a higher resolution so it doesn't feel like a downgrade
>>
>>729066082
Stop playing unoptimised modern AAA slop, suddenly every game worth playing runs at 240hz.
>>
>>729066152
nta but is 4k support even common
>>
I recently upgraded to a 27" 1440p from a 22" 1080p. My GPU heats up faster even when I'm not playing games, which is kind of annoying (I have a 2060), and frankly... it's actually too big as a computer monitor. BUT, I got a VA panel and I notice that my eyes get way less tired from looking at it compared to my previous IPS display.
>>
>get 1440 or 4k monitor
>all your legacy games look like shit or require 3 dubious russian patches to run

T-thanks
>>
File: 20220308144248_1.jpg (852 KB, 3840x2856)
>>729066223
>is 4k support even common
I don't understand the question. It's a resolution, I've run into one game ever that had an issue at 4k, and that was Uplink, because it made shit too far away so it was a little harder to play.

It's the modern standard of 16:9, so every 3D game works. For games with improper FOV calculation (basically anything that isn't Hor+ and was intended for 4:3 displays), you'll have the same FOV issues you'd have on any other modern screen without a patch or ini edit.

https://www.wsgf.org/article/screen-change explains things better. Now, for better 4k support, the game should offer FOV adjustment, because obviously playing on a larger screen at the same FOV as a smaller one is stupid. Here's an extreme example of what FOV should be set to at 720p and 4k to achieve proper FOV.
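
For reference, the Hor+ conversion that wsgf page describes boils down to keeping vertical FOV fixed and widening the horizontal with aspect ratio. A minimal sketch of that math:

import math

# Hor+ scaling: vertical FOV stays constant, horizontal FOV grows with aspect.
# Converts a horizontal FOV specified at 4:3 to its Hor+ equivalent.
def horplus_hfov(hfov_at_43_deg, target_aspect):
    h = math.radians(hfov_at_43_deg)
    return math.degrees(2 * math.atan(math.tan(h / 2) * (target_aspect / (4 / 3))))

print(f"90° at 4:3 -> {horplus_hfov(90, 16 / 9):.2f}° at 16:9")  # ~106.26°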
>>
>>729066565
>4k monitor
>have to play at 720p anyway
>>
>>729066129
I did do that but I still feel like I'm missing out on frames. I guess that's just because modern games are trash, thoughever.
>>
>>729069365
What gpu are you using anon?
>>
>>729065201
>1280x720px
It ain't gonna appease anyone's 'tism. Post actual resolutions for good comparisons.
>>
>>729064179
You can use DLSS/FSR4 in Quality mode, I think it should look good at 1440p.

I use a 4K display, and DLSS and FSR4 look good even in Performance mode.
>>
>>729072414
Yeah but it's like playing at 60 fps at 1440p vs 100 fps at 1080p
>>
>>729064179
I can't tell the difference between 720p and 4k on my 1366x768 laptop, so it's a meme.
>>
4k is a colossal meme pushed by retards who think they have superman vision.
You also get retards who think they can see 5ms and talk about how they NEED to run a game at 180fps+ when in reality all they need is smooth framepacing, which vsync 60hz does perfectly to the limit of human perception.

Machine measurable =/= human perceptible, but there are people who swear they can totally see it, yet blind tests show nobody can.
>>
>>729073361
>poor
>can't type != or ≠
>instead chooses equals divided by equals
>>
>>729073361
I can detect a 0.5ms difference so..
>>
>>729073361
>4k is a colossal meme pushed by retards who think they have superman vision.
No, you just don't understand 4k. My 4k monitor is 4x the size of my old 1080p one. It contains 4x more stuff, at the same PPD.
>>
>>729073361
is this a new "human eye can only see 30fps"
>>
>>729073361
>4k is a colossal meme
>DLSS and FSR4 in Performance mode on the latest GPUs have barely any performance impact, so they perform almost as good as native 1080p
>while looking a lot better than native 1080p
>>
>>729073871
>My 4k monitor is 4x the size of my old 1080p one
bro has a fuckin TV on his desktop
>>
>>729074280
not him, but I literally do, it's comfy as fuck
>>
>>729074347
never got the appeal of having a big screen on my desk. Tried it, just hurts my neck as the screen is too tall. Instead, I connect to the TV in my living room via steam link.
>>
>>729074443
Move your screen lower, or your head higher, genius.
>>
>>729074443
I don't have it directly on my desk. It's like 1.5 meters away from where I sit, so I don't have to move my head that much.
>>
>>729074521
What kinda desk/chair combo are you using that allows you to do that with a 40"+ TV?
>>
>>729073493
t. Placebo
>>
File: file.jpg (2.86 MB, 4096x3072)
Here's my setup with beer bottles for scale
>>
>>729074224
lol, lmao even. You copers will say any old shit
>>
>>729074559
Your head should be in the top third of your monitor, and for a 40" screen you should be sat around 2.5 feet away, measured from eye to screen. At this distance you shouldn't have to move your head at all to look at the monitor.
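
Rough geometry check on those numbers (assuming a 16:9 panel; the distances are examples, not a standard):

import math

# How much horizontal field of view a screen covers from a given distance.
def h_angle_deg(diag_in, distance_in, aspect=16 / 9):
    width = diag_in * aspect / math.hypot(aspect, 1)  # panel width from diagonal
    return math.degrees(2 * math.atan((width / 2) / distance_in))

print(f'40" at 30 in (~2.5 ft): {h_angle_deg(40, 30):.0f}° of horizontal FOV')
print(f'27" at 24 in:           {h_angle_deg(27, 24):.0f}° of horizontal FOV')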
>>
>>729074856
I assume you're >>729074534 ?
Looks pretty comfy to be desu, but I couldn't use my PC like this
>>
You DO sit the correct distance from your screen right anons? I bet most do not.
>https://www.ultraselective.com/blog/optimal-viewing-distance
>>
>>729074674
t. someone who only used 60hz his entire life
>>
>>729075090
Do you also hear at 22khz by chance?
>>
>>729073361
have you actually used a high refresh rate monitor before? it absolutely makes a perceivable difference. if that difference is important to you however is another story.
>>
File: file.png (625 KB, 881x514)
>>729075036
>minimum viewing distance
>not immersing yourself in the game
>>
>>729075280
>anon is blind AND deaf
zamn
>>
>>729064476
that doesn't stop you from running games at 1080p
>>
>>729077108
based labrador
>>
>>729073361
>Pixels aren't real.
>>
>>729078547
lol, lmao
>https://hothardware.com/news/scientists-claims-4k-and-8k-tvs-arent-noticeably-better-than-hd-to-the-human-eye
>>
>>729075036
This shit is so outdated, it refers to 16:9 as "Widescreen".
>>
>>729076982
>tn shitcan
>immersive
>>
>>729073361
fuck off retard, 60hz is objectively not enough
the only reason it even exists is because of the fucking power grid
>>
>>729079443
what's the marketer shilled panel nowadays? oled? I fell for IPS and while the colours are slightly nicer than TN I guess it has a fuckton more motion blur
>>
>>729064179
In basically all use cases I want graphical settings one step up from the bottom and the maximum FPS I can get.
>>
>>729064179
1080p! ᕙ( •̀ ᗜ •́ )ᕗ
>>
>>729079581
oled is the best we can get until microled becomes production viable
right now a 110" microled tv costs $30k so that'll take at least a decade
or you can be a miniled coper and pretend that taking a hammer to the backlight somehow makes ips or va less mediocre
>>
>>729073361
>Machine measurable =/= human perceptible, but there are people who swear they can totally see it, yet blind tests show nobody can.
Actually line spacing tests in optometry suggest the average human under average conditions can resolve detail down to around one arc minute. That would equate to around 12,000 pixels across your entire FOV, or for a screen that takes up one third of your FOV, around 4,000 aka 4k. If you have a massive monitor you might want to go higher.

As for refresh rate, I'm not sure about the science of that, but I agree that it really doesn't matter after around 100hz if it's smooth, even though there are perceptible differences beyond that.
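
The arithmetic behind those figures, for anyone checking (one-arcminute acuity assumed, as stated above):

# One arcminute of acuity = 60 resolvable "pixels" per degree of view.
ARCMIN_PER_DEG = 60

def resolvable_px(screen_angle_deg, acuity_arcmin=1.0):
    return screen_angle_deg * ARCMIN_PER_DEG / acuity_arcmin

print(f"screen filling 1/3 of FOV (~67°): ~{resolvable_px(67):.0f} px")  # ~4k
print(f"full ~200° FOV: ~{resolvable_px(200):.0f} px")                   # ~12,000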
>>
>>729064179
4k nigga
I was playing 1080p back in 2009
>>
>>729064179
Whether or not it's worth it to upgrade from a 1080p monitor to 1440p will completely depend on your GPU and how much of an fps hit you will take
>>
>>729064179
If the AA is properly implemented (like no smeared TAA) 1080 @ high all day
>>
>>729079787
What is the minimum GPU for 1440p?
>>
>>729080201
my 3080 is barely hanging on
>>
>>729080474
RTX 3080 is absolutely fine. A 3070 would be fine I'd wager if not hampered by only 8gb VRAM. I have a laptop with a 4090 which is probably not dissimilar in performance from your 3080 and it runs everything fine at ultra settings on the laptop's 1440p panel with DLSS set to quality. I got a portable OLED screen recently with 3k res and even that runs fine with DLSS set to balanced instead.

If you're vehemently against DLSS then you may as well give up on modern games altogether, they are not designed to run properly without it, it is not a matter of hardware.
>>
>>729080201
Every slider must be all the way to the right at all times no exceptions so....a 5090.
>>
>>729064179
If you're not hitting 60 fps, you should lower the resolution/quality, imo. 1440p monitor is worth it for me as I also work on my PC and interfaces are a bit nicer, though at work I'm still on a 1080p (both 27") - the difference is not jarring.
Furthermore it heavily depends on the type of games that you play - in 3D games where camera moves most of the time, it's less noticeable, compared to a strategy game, for example, where your whole screen is filled with 2D UI.
>>
>>729081010
what about a 5060 ti 16gb version?
>>
>>729079443
What downside does TN have other than muh viewing angles and muh 10% worse colors
>>
If it was up to console fag we'd still be at 480/720p xbox360 tier of resolution
>>
>>729083282
>10% worse
lol it looks like absolute shit and it's still demolished by oleds at response times
best flip that question and ask what upsides does tn have
and the only answer to that is being cheap
except not really since those high end zowie tns are as expensive as oleds and there's a 1440p asus monitor out now that can do 720hz at 720p
>>
>>729083282
luv me smearing
>>
>>729084297
>>729083936
TN has highest response wdym. That's why pro gaymers always use em
>>
>>729082010
Would be absolutely fine for 1440p.
>>
>>729083282
Bad blacks with middling whites. Very warm and power inefficient.
>>
>>729084463
The black pixels are physically slower than the others. Even the highest rated TN on RTings has this as an issue.
>>
>>729084463
they use zowie shitcans because they're fucking sponsoring them
it would take you 10 seconds searching up zowie sponsor to find out about this
and no, literally nothing comes close, oleds are in a league of their own
>>
>>729084463
TN response was good like 10 years ago unc.
Others have caught up and exceeded it now.
1ms used to be the gold standard of TN response, we're at 0.03ms now.
>>
>>729064476
That's because you're a retard with a low IQ. Probably still using Windows and nVidia cards because you're a sheep brained NPC
>>
>>729064179
Have you tried not being poor?
>>
>>729084673
>540hz TN has something like 5 times the inverse ghosting of any old 165hz oled
>even when said oled has more pixels to shift
>>
>>729084673
>shills unboxed
>>
>>729086759
>the data that you can easily cross reference is wrong because i'm mad that the most primitive lcd tech is getting destroyed in a benchmark
>>
>>729087235
dude just makes shit up to make companies he likes look good
he likes oleds so oleds mysteriously win all of his monitor review benchmarks and he glosses over their flaws

see also: why they are known as AMD unboxed
>>
>>729087676
cope all you want, you're still denying data you can't disprove because you're upset about tn garbage being made redundant
>>
>>729088038
based flying spaghetti monster believer
>>
>>729088204
feel free to buy that zowie shitcan and compare response times to a bottom of the barrel oled
if you're going to be so contrarian you should have some solid foundations and not just butthurt
>>
>>729064179
2160p on Max Settings
>>
>>729087676
>see also: why they are known as AMD unboxed
I don't deny they had a bias, but that denies the reality that everyone else had an nVidia bias. Because the way the HU bias manifested was by using games and settings that simply WEREN'T favorable to nVidia. Next time you're looking at GPU reviews pause the video on the settings page and look carefully. Notice how everyone picks slightly odd looking choices for "high" and "ultra". Kind of like maybe they noticed that their favorite cards weren't quite leading the charts the way they wanted and would "massage" the settings until they did?
When DX12 and Vulkan appeared on the scene it was outrageous. AMD was ahead of the game on that (because vulkan was basically Mantle and DX12 really suited AMD's compute focussed design. Ironic that nVidia stole the crown with CUDA, but I digress) and all the usual shills were tripping over their own test methodology, desperate to get nvidia back into the lead, citing "unfair advantages" that AMD had and games that were "optimised for AMD" as reasons they had to drop them from the test. So "the way it's meant to be played" games are fine and representative, but the one time AMD wins, it has to be dropped?
>>
File: nv shills.jpg (432 KB, 2560x1440)
>>729087676
Cope.
>>
>>729088381
oh pray tell how popular ashes of the singularity is
or strange brigade
>>
>>729073361
Higher FPS absolutely makes a huge bloody difference in games where continuously responding to objects you're trying to track matters. Upgrading from 60FPS to something significantly higher is easily the single most cost-effective and meaningful upgrade you can make if your monitor only does 60.
>>
File: file.png (3 KB, 104x42)
>>729064179
nigger
>>
>>729066082
He's playing in 900p with DLSS and thinks he's playing at 4k.
It's unfortunately very common.
>>
>>729064179
I have a 4K 350 hz OLED for games that aren't RT heavy that my 4070ti Super can chew through. For really RT heavy games I use my 1440p panel for great frames on ultra. I think this is the perfect ecosystem
>>
>no one ever says usecase
reminder that higher refresh rates are currently only useful for competitive games and nothing else, where the 1% lows do make a difference.
sure you can make the case that single player games are smoother and more enjoyable with higher FPS, but most of them are made with consoles (a la locked 30fps) in mind. this will never change until the video game market changes (it won't).
>>
File: no eyes to see.jpg (114 KB, 1650x928)
>>729064179
Frame rate counts as part of graphical quality, because motion looks better to your eyes
>>
>>729064179
I will forever play 1080p if it means the fps can stay consistent at 90+

>muh 4k
and then the games play like ppt so troons and faggots can take pictures
>>
>>729088914
You are correct.
>>
>>729064179
I'd rather play at 720p Ultra with motion blur turned off
because fuck you, my shit never lags, never drops frames, & if that isn't an option then I'm refunding
>>
If I can't max out a game and get good frames on my gtx 1650 then i'm not playing the game. No dev gets my money if they can't make their game run on hardware the majority of people own.
>>
>>729079675
Tests point to the human limit of perception being insanely high, over 330khz.
There's practically no upper limit to how much higher refresh rates will improve a display, considering that the required refresh rate for perception of motion clarity increases almost indefinitely with speed. So no, 100hz isn't enough. Not 240hz. Not even 1000hz is anywhere near enough to meet the ability of human vision.
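
The reason there's no hard ceiling, in numbers: on a sample-and-hold display, eye-tracking blur scales directly with refresh rate. A sketch (the motion speed is an example, not a measurement):

# On sample-and-hold displays, an object tracked by the eye while moving at
# S px/s smears across roughly S / refresh_hz pixels each frame.
def smear_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

SPEED = 3840  # one 4k screen-width per second, a fast but ordinary pan
for hz in (60, 144, 240, 1000):
    print(f"{hz:>4} Hz: ~{smear_px(SPEED, hz):.1f} px of smear")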
>>
>>729090837
>t. samsung display rep
>>
>>729084673
GtG is an utterly useless metric.
>>
>>729091107
>the speed at which a monitor changes colors is useless at measuring refresh rate compliance
>>
>>729091732
black to black is the proper way to measure refresh rate. GtG is marketing bullshit at best; it doesn't mean anything.
>>
>>729064476
Just use DLSS to get back to the same internal render resolution and get better picture quality for free.
4K is now the best gaming resolution regardless of performance as long as the game supports DLSS, though for desktop usage 1440p is better.
>>
>>729091878
>use a test that makes TN and MiniLED (fancy VA really) look bad
There is a reason why nobody uses such a test.
>>
>>729092137
Yeah, like I said: GtG is marketing bullcrap designed to mislead consumers.
>>
>>729064179
1080p is really ass so I'd probably go with 1440p and lower settings.
>>
>>729091878
So all your games are pure black except for the moving objects? GtG is the most realistic benchmark.
>>
I'm not moving to 4k until I can not only get a flagship card every 2 years but also a 16:10 monitor of that size too
Give me 16:10 or give me death
>>
>>729092262
>t. 2009 coma anon
>>
>>729091878
gray to gray tests how quickly every single color diode switches between being any shade of completely on to completely off and everything in between
black to black tests fucking nothing
>>
>>729092216
>>729092420
GtG only measures response times between mid-tones, which are already by far the fastest transitions a panel can do, so if a display has good GtG but poor BtB then it'll be smeary dogshit regardless of what its GtG shows. It is a genuine snake-oil measurement designed to mislead.
Monitor companies used to proudly declare their BtB response times right up until response times tanked with cheaper panel technology. The fact you can get "1ms" GtG response time displays with atrocious smearing issues says it all.
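
For the anons arguing past each other: both GtG and BtB numbers come out of the same kind of measurement, a 10%-90% luminance rise time off a photodiode trace. Minimal sketch with synthetic data:

import math

# Extract a 10%-90% response time from a (time, luminance) trace, the way
# both GtG and BtB figures are derived. The trace below is synthetic.
def response_time_ms(times_ms, lum):
    lo, hi = lum[0], lum[-1]
    t10 = next(t for t, v in zip(times_ms, lum) if v >= lo + 0.1 * (hi - lo))
    t90 = next(t for t, v in zip(times_ms, lum) if v >= lo + 0.9 * (hi - lo))
    return t90 - t10

ts = [i * 0.1 for i in range(200)]            # sampled every 0.1 ms
trace = [1 - math.exp(-t / 3) for t in ts]    # fake exponential rise
print(f"{response_time_ms(ts, trace):.2f} ms")  # ~6.5 ms for this trace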
>>
>>729092637
it tests literally every possible color change and the results are then averaged
it's objectively correct and your extreme cases are represented too
>>
>>729064825
>4k is a meme, I own a 4090, the second best GPU on the market, and it's not strong enough for comfy 4k.
The main benefit is the vastly improved quality of the screen. Low res screens with big pixels suck. Running 1440p via DLSS on a 4k screen is going to look better than 1440p with DLAA on a 1440p screen, because having more physical pixels to work with improves image quality.

Of course, being able to render native 4k on a 4k panel is obviously preferable if you have the performance headroom for it, but even if you don't it's not really smart to buy low res screens in pursuit of performance. That way of thinking is just ancient wisdom from the days when we ran everything at native res and upscalers were complete and utter dogshit, it's not really applicable anymore with how modern, demanding games render and run.
>>
>>729074051
Yes, new cope from poorfags and / or blind retards. You can look up the acuity limits of human vision, a 27" 4k monitor like 65cm / 2' away from your eyes like on a desk is still below the acuity limit of normal vision and people with above-average eyesight can do even better than that.
>>
>get 4k screen
>play games at a locked 30fps
feels good man
>>
>>729092768
Are you a monitor salesman?
As you're spewing more bullshit than a snake oil salesman.
GtG; the clue is in the name. It measures the gray-to-gray transitions of a display. It does NOT include color transitions, dark transitions, bright transitions, or mixed transitions whatsoever. In fact, GtG was devised specifically to avoid measuring any of these relevant measurements on purpose - nearly all monitors can do gray-to-gray transitions far faster than anything else. Also, only around 40 frame transitions are even measured, so in no way are people testing the entire color range. That could not be more bullshit if you tried.
>>
>>729093216
>frame
*per-pixel
>>
>>729093216
color transitions ARE INCLUDED in gray to gray transitions
it's literally just a faster way of testing for it
say you're going from pure 255 red to pure 255 green
that would at worst be equal to the worse result between going from 0 to 255 and 255 to 0 on the gtg test
>>
>>729064179
>poorfag or poorfag
I'm not poor though. The time for 27 inch monitors is long gone bro. They're so cheap only poorfags buy them. Go big or fuck off
>>
>>729064179
4k low settings
t. 360hz 1440p oled fag
1440p gives me eye cancer
>1080
lmao
lmao even
>>
>>729093464
You're literally making this up.
They DO NOT test red-only/blue-only/green-only (or any color) pixel changes. At all. They flash the pixels gray and change the brightness value and measure the luminance response time. R -> G / G -> B / B -> R / etc. is a completely different test; chromatics are checked separately.
GtG is only checking brightness response times within a constrained range. It is a (flawed) luminance response time measurement.
>>
>>729093904
testing one diode going from 0 to 255 and the other from 255 to 0 is functionally indistinguishable from telling all of them to go from 0 to 255 or 255 to 0
the point of using shades of grey is that it's using all the diodes so you're always getting the worst possible result that exposes the bottlenecks
your suggestion would in fact make it easier to lie
>>
>>729065117
I've got the 5k2k LG monitor and run all the games I play fine with a 5080. Diablo 4, PoE2, etc. PoE 2 actually looks and runs better using DLDSR to simulate quasi-8k and then putting the in-game DLSS on performance.

Some weird cope in this thread from guys stuck on 1080p monitors or who exclusively play competitive shooters in which all you care about is 200+ fps.
>>
>>729094183
>shades of grey
>using all the diodes
Bull-fucking-shit. That is not how it works.
Grays use equal values for R/G/B, with all subpixels rising by the same amount between luminance checks. This is why a monitor with "excellent" GtG response times can still have poor ghosting of specific color channels as GtG does not measure independent chromatic changes of a display's subpixels.
>your suggestion would in fact make it easier to lie
It's not a suggestion. It's what they actually do for a "GtG test". That's why they are hot bullshit and tell you basically nothing at all about a display's response times in real-world usage. It's fucking snake oil.
>>
>>729093154
She is fast.
>>
>>729092839
So you enjoy playing games in 30 fps huh
>>
File: gta3_2021.12.18-11.37.jpg (1.36 MB, 3840x2160)
>>729094851
>when you use your 540hz tn for the first time
>>
File: 1740577625592468.jpg (136 KB, 400x384)
>>729074224
>4k mustard race!!!!111
>dlss on
>>
File: 1716726670294180.webm (3.6 MB, 640x360)
>>729094976
>When you forget to cap your FPS
>>
>>729094647
>Bull-fucking-shit. That is not how it works.
that's literally how it works, do you not know what hex color values are?
>This is why a monitor with "excellent" GtG response times can still have poor ghosting of specific color channels as GtG does not measure independent chromatic changes of a display's subpixels.
name the monitor then
>>
>>729093904
You are right in that the speeds of the subpixels aren't tested separately, but the GtG still represents the average of them. So GtG only tests average clarity, not potentially disproportionate smearing of one color. I guess giving response time tables for every subpixel might be useful. Though I rarely see that in motion tests either. Other smearing like black smearing you can literally see from the table, which includes tests from and to BLACK from every value. You can't see black smear from the averaged GtG number and you shouldn't base your monitor purchasing decision on that one single number. I guess you could use some kind of different average that amplifies the extremes to favor flat performance. Or calculate a deviation number in addition to the average.
Your R -> G / G -> B / B -> R tests are completely unnecessary if you tested each color separately. Unless you can provide evidence that the different subpixel transitions somehow affect each other, which could be worth ruling out, I doubt it though.
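
To make the disagreement concrete, this is the whole crux in a few lines (illustrative values only):

# A GtG step drives all three subpixels by the same delta; a color transition
# can drive them in opposite directions, which a GtG pair never exercises.
gtg_step = ((40, 40, 40), (80, 80, 80))    # typical GtG test pair
color_step = ((255, 0, 0), (0, 255, 0))    # red -> green

def per_channel_delta(a, b):
    return tuple(y - x for x, y in zip(a, b))

print("GtG deltas:  ", per_channel_delta(*gtg_step))    # (40, 40, 40)
print("color deltas:", per_channel_delta(*color_step))  # (-255, 255, 0)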
>>
>try to play older 1080p-era games on 1440p
>waste hours on crashes and downloading .bat patches from Russian hackers to get widescreen support

lole
>>
>>729095052
Better than native.
>>
>>729095232
>1080p game
>no widescreen support
uh-huh, any other nuggets of wisdom anon?
>>
File: 1762093586355331.png (3.97 MB, 2560x1440)
>>729095292
It is. (this is in motion btw)
>>
File: 1709054142233926.jpg (721 KB, 1600x1066)
>When you realize your old 17" CRT had better PPI than your current 27" 1440p LCD
>>
>>729095590
>Anons are hating on DLSS
>In reality they should hate TAA
>>
>dude, you don't need to see your HUD
>>
>>729095810
The only people who hate DLSS are salty AMD users who are stuck with FSR which is STILL worse than dlss 1.0.
>>
>>729095354
some don't go beyond 1080p brainlet

if you want to play Bamham or SoM or games of that era you have to fuck with the config file and hope it works

even then the UI will probably look like shit
>>
>>729095193
>that's literally how it works
No, it isn't.
In a GtG test, a pixel is set to one gray level (equal R/G/B), then switched directly to another gray level. A photodiode measures how long the luminance takes to move between 10% and 90% of the final value. This measures only luminance transitions, NOT independent subpixel or color transitions in any way.
>name the monitor then
A good amount of VA panels can achieve 1ms GtG yet have noticeable smearing (usually blacks, sometimes purples or other colors)
>Samsung Odyssey G5
>AOC C27G1
>MSI Optix G27C4
(etc.)

>>729095224
GtG is NOT a measurement of average subpixel speeds, it is a measurement of ONLY the combined luminance curve (NOT individual R/G/B curves), a photodiode used to measure GtG response can't even detect color channels, so the idea that it's in any way a means of measuring color response times was wholly incorrect. GtG is only measuring luminance. I cannot repeat this any more than I already have.
>Unless you can provide evidence that the different subpixel transitions somehow affect each other
I wasn't saying they affect each other. I'm saying that for a color response time test you'd need to use different sub-pixel channel values (e.g.; instead of [R-G-B] 40-40-40 -> 80-80-80, test 40-0-0 -> 0-0-40 as an example of R -> B color transition testing)*
*this is an oversimplification of how the test is carried out, but you get the point
HOWEVER!
They DO, in fact, affect each other. But this post is already getting too long to give you a thesis about LCD technology as it is.

GtG deliberately HIDES response time bottlenecks, rather than exposing them as you seem to believe, because all subpixels move in equal measure to each other for each transition tested.

I'm sorry, but you've been fucking swindled by these display panel makers if you believe otherwise. If you want to understand the response time of a display, GtG is the WORST way to measure it in just about every way.
>>
The important part about any game is that it fucking works. I don't give a shit about the graphics if the game is actually fun.
>>
>>729096393
So you're telling me a number of old 1080p era games have no widescreen support?
>>
>>729073435

Touch grass
>>
>>729096539
all those monitors absolutely eat fucking shit in benchmarks
don't tell me that all this time you were dumb enough that you were talking about what's on the fucking box and not benchmarks
>>
>>729096745
they don't have 1440p support
>>
>>729084779
Lol because AMD cards are better? Fuck off poor fag
>>
>>729073361

Get your eyes checked.

4K is only a meme because the strongest cards on the market can barely push 4K at a decent framerate.
>>
>>729096968
That wasn't the original statement. The claim was;

>waste hours on crashes and downloading .bat patches from Russian hackers to get widescreen support
You might need to brush up on what widescreen means.
>>
>>729096857
As far as I'm concerned, their GtG benchmarks are as made up as the figures on the box; the only difference is the manufacturer can cherry-pick the best range of values for their own tests while a benchmark should be using consistent-[ish] values on each display as independent tests, but it is still a massively misleading way to measure response times of a display. GtG is in no way a reflection of what a display will be like for the user experience in the real world. BtB is a better measurement, but ideally you'd get full transition table benchmarks, MPRT, PCT, and color transition testing. >>729092768 is okay. >>729084673 is useless.

GtG is plain misleading. Do you even know yourself what values reviewers tested with? Did they disclose this clearly themselves? Even if they do, GtG has so little relevance to a display's typical response time that it is not worth looking at.
>>
>>729097059
We've known 1080p is the limit outside of truly mega screens (like in a cinema) for the last 20 years anon.
>>
>>729097393
an average is still an average, any benefits it gives it also takes away, that's literally what an average is
even if we use the worst value on the gtg table
according to it, oleds still absolutely fucking shred tns
an oled can portray up to 1250hz without any issues while a tn gets a pathetic score of 160hz with that atrocious 6ms time
if anything measuring it by the average HELPS non oleds and they still tear every other monitor apart
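
The 1250hz/160hz numbers come from the usual rule of thumb that a transition has to finish within one frame (the response times below are illustrative, not measurements):

# Max refresh a panel can actually resolve: the pixel must finish its
# transition inside one frame, so max_hz ≈ 1000 / response_ms.
def max_refresh_hz(response_ms):
    return 1000 / response_ms

print(f"OLED at ~0.8 ms worst case: {max_refresh_hz(0.8):.0f} Hz")  # ~1250
print(f"TN at ~6 ms worst case:     {max_refresh_hz(6.0):.0f} Hz")  # ~167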
>>
>>729098051
why even post the garbage GtG chart when these much better charts exist?
worse than graphics cards' benchmarks I swear
>>
>>729092839
Nah bro DLAA 1440p looks better than Perf 4k
And going for a 4k screen basically locks you out of high framerate gameplay unless you want to lower your res and fuck up your PPI if you're on a 32 inch
whereas 1440p is guaranteed high framerate in everything with the added benefit of being able to turn DLDSR on to run the game internally at 4k, which gives you an image quality pretty close to the real thing even if it's not the same
Anyway, comfortable 4k isn't even possible yet on demanding games. 6090 will be the first 4k card if it's at least 50% better than a 5090
>>
>>729098530
because most people know what an average means and would rather compare those between multiple monitors
>>
File: 1676772579128522.jpg (151 KB, 1024x905)
>>729064179
I play at 1080p on highest settings but limit my FPS to either 60 or 90 depending on the game.
>>
>>729098832
average of WHAT though?
There's no values specified that were tested for in the chart. Again, it's a useless chart without the numbers. Which GtG transitions were used?
>>
>>729064476
just use fake frames
>>
>>729099119
at this point you're just fucking baiting, this test is good enough for anyone that isn't a semantics arguing midwit
call back to the original point, this absolutely does prove beyond a shadow of a doubt that oleds are stomping every other monitor tech, tn isn't actually capable of displaying high refresh rates due to the retardedly high overshoot and va and ips are all still copes compared to oled
>>
>>729099828
The graph itself is bait
>>
>>729097607

If you're 1440-phobic
>>
>>729095232
Just run integer scaled 4:3 like God intended.
>>
>>729099912
do you need them to include the minimum, mean and maximum like it's a fucking math class to stop nerding out about shit that doesn't matter?
everyone understands what an average is in this context, stop crying
if they want to buy that specific monitor they can look up its individual graphs, the average absolutely does carry enough information to take a fat shit on shit monitors
i'm out
>>
File: file.png (159 KB, 700x700)
>do you need them to include the minimum, mean and maximum like it's a fucking math class
>>
>>729098723
>Nah bro DLAA 1440p looks better than Perf 4k
not him but it quite clearly doesn't
>>
File: 1766213056355680.jpg (173 KB, 1536x864)
>>729064179
I recently spent 1.2k on a 4k oled monitor. I realize now, I should have spent the same amount on a monitor that maxes out the Hz. Aim for a good 600Hz monitor brokies.
>>
>>729101942
>needs 600hz of TN to get the same motion clarity as 240hz oled
>>
>>729064825
>4k is a meme, I own a 4090, the second best GPU on the market, and it's not strong enough for comfy 4k.
The vast majority of people using 4k monitors don't play native 4k, they use fake 4k (via DLSS upscaling) which is still much better overall than 1440p. My fake 4k looks like ~1800p while running at ~1300p framerates. The only people who assume that you HAVE to play at native 4k to get good image quality are people still at 1080p/1440p.
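
For reference, the commonly cited DLSS render scales per mode (treat the exact percentages as assumptions, games can override them):

# Internal render resolution per DLSS mode at 4k output.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(out_w, out_h, scale):
    return round(out_w * scale), round(out_h * scale)

for mode, s in MODES.items():
    w, h = internal_res(3840, 2160, s)
    print(f"4k DLSS {mode}: renders {w}x{h}, outputs 3840x2160")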
>>
>>729102124
>fake 4k (via DLSS upscaling) which is still much better overall than 1440p
DLAA 1440p native > 4k DLSS
>>
>>729101942
i have a 500hz oled and there's really no improvement over running it at 250hz with bfi
the clarity improvement isn't there when you're looking at something more than white and black lines moving across the screen
i'll hold onto it until they make a 4k 500hz ultrawide
>>
>>729102249
>DLAA 1440p native > 4k DLSS
I don't think so. I have 4k and 1440p monitors side by side on my desk right now and the only scenario in which DLAA 1440p native looks better is if the 4K DLSS uses some really low input percentage like 35%. 4K DLSS just looks way more stable and clear.
>>
>>729102329
what monitors?
>>
>>729102451
ASUS Rog Strix XG27UCG is the 4k one, AOC Q27G3XMN is the 1440p one. They're both 27 inchers so it's possible that the high pixel density of the 4k screen is making DLSS artifacts hard to see, and if the 4k screen was a lot bigger I would see more artifacts. But as the kids would say that is NOT MY PROBLEM.
>>
>>729064179
because of the nature of real time rendering, videogames disproportionately benefit from increased resolution; uncompressed 1080p video looks leagues better than the equivalent game footage.
>>
>>729097187
It doesn't have full widescreen support if it doesn't include 1440p

hylic
>>
>>729064756
>With modern games (TAA), a higher res does more for visual quality than settings do. Most games today also leverage upscaling, and 4k DLSS looks better than native 1440p and a hell of a lot better than native 1080p, with higher fps than native 1440p. At this point, if you play modern games, you are better off with a 4k monitor.

Can someone elaborate on this? I'm still running on the "always pick your native resolution with 144+ fps" oldfag advice

How do modern games do it? Do you unironically think 1440p/4k on 30-60fps feels good?
>>
File: sohWhy9.jpg.png (458 KB, 1731x1506)
>>729102980
>game doesn't support meme resolution
>>
File: 1391986390231.jpg (329 KB, 1440x1080)
>>729064179
>clipped image
>of a video
>580x433
>>
>>729104075
>Can someone elaborate on this?
Modern 3D games, especially all the graphically intense ones, are built around rendering methods that degrade in quality the fewer pixels you have to work with, the main rendering method being TAA. TAA looks perfect at 8k but looks garbage at 720p, even though 720p used to look fine before. A part of why 720p no longer looks fine isn't just TAA but because there's a LOT more detail in games now, and the effort required to make all that detail look good and stable has to be a lot more aggressive nowadays. Imagine if your shirt only has small stains and you can just wash it with soap, but if your shirt has serious greasy stains you need a much harsher detergent that slightly degrades your shirt's fabric over time.
>I'm still running on the "always pick your native resolution with 144+ fps" oldfag advice
Things changed with how much upscaling technology has advanced. It used to be just a shitty cope and gimmick to be sure but at this point FSR4/DLSS (the respective main upscaling technology of the recent AMD/Nvidia cards) has gotten very, very good at faking higher resolutions, especially because they're more advanced and thorough than TAA. And the higher their target resolution, the better their results are. Hence why people don't really recommend FSR4/DLSS at 1080p because it's an obvious tradeoff (you get better performance for slightly worse image quality) but at 4k it's really recommended because the image quality degradation is much harder to notice.
Nowadays if you have a modern GPU and want a better balance between image quality and performance, you're better off upscaling to a much higher resolution than you are playing at a lower native resolution. Of course that only makes sense if your GPU has the power, for example if your GPU struggles with native 1440p then it'll struggle trying to do 4K DLSS as well.
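
If you want a gut-check on the performance side, a first-order estimate under the crude assumption that render cost scales with pixels drawn (real games deviate from this):

# Ballpark fps when changing the internally rendered pixel count.
def est_fps(base_fps, base_px, target_px):
    return base_fps * base_px / target_px

px_1440p = 2560 * 1440
px_4k = 3840 * 2160

print(f"100 fps at native 1440p -> ~{est_fps(100, px_1440p, px_4k):.0f} fps at native 4k")
# 4k DLSS Quality renders ~1440p internally, so it costs roughly native-1440p
# fps minus some upscaler overhead, while outputting a 4k image.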
>>
>>729104474
hey fuck you, that is massive on my 1366x768 screen
>>
>>729075036
>buy bigger screen
>only to sit further away

What's even the point?
>>
>>729066468
>what is integer scaling
>>
>no money for new pc
>have to play on my 5 year old laptop
>4k OLED display
>can't play anything newer than 2-3 years old on 4k

The newest stuff still looks good on high/highest settings on 1080p...with 30-60fps on my 15 inch screen. I wish I'd switched to desktop again a couple of years ago. I would also have to get a monitor and don't want to downgrade from 4k oled


