In 2025, How is 4k gaming compared to 2k?
Posted by ShadiestOfJeff@reddit | buildapc | View on Reddit | 563 comments
I have an old monitor that I shelled out cash for back in the day when the 2070 Super came out: a 1440p 120Hz G-Sync TN panel. Since upgrading my PC to a 9070 XT and a 9800X3D, I'm wondering how far technology has come for 4K gaming to be viable and whether it's a reasonable step to take for my current system.
farmeunit@reddit
It's totally viable, but I think 3440x1440 is ideal. Better performance. If higher refresh rates aren't necessarily a big priority, it will be fine at 4K. I just don't think the visual difference is as big as going from 1080p to 1440p.
DEPRzh@reddit
4K gaming was fine 4 years ago. Now it's basically unfeasible since the performance of new games is deteriorating 100x faster than GPUs are improving.
Wander715@reddit
4K is totally fine as long as you use DLSS. Currently using a 4070 Ti Super and can play basically anything at 4K. I've even used it for pathtracing in Cyberpunk and AW2 although I have to heavily use DLSS and frame gen for a good experience.
skylinestar1986@reddit
Basically anything at what framerate?
Wander715@reddit
With DLSS in AAA titles I usually get 100+ fps. Great smooth experience with either DLSS Quality or Balanced, which now with the transformer model basically looks like native quality to me, I'd be hard pressed to tell a difference.
In heavy titles like Cyberpunk, AW2, and Wukong with pathtracing on I use DLSS Performance and frame gen and get somewhere around 70-80fps with base framerates around 50-55 at their worst. Still a very good experience with Reflex.
FlorpyDorpinator@reddit
I have a 4070 ti super, where can I learn these OC techniques?
Wander715@reddit
What model do you have? When I bought one I specifically opted for one that had a raised power limit out of the box because I knew I'd want to do some overclocking, went with a Gigabyte Gaming OC.
I'm just using MSI Afterburner, nothing fancy. Have power limit set to 112% and core clock at +200MHz, memory clock at +1500MHz which is the same memory speed as 4080 Super.
I get a noticeable bump in performance in most games, I've done direct comparisons changing the OC in game with Afterburner. Again usually somewhere in the ballpark of 10-12%. If you don't have a card that can raise power limit past 285W don't expect to be able to get a stable OC that high unless you really won the silicon lottery with your chip.
rodmedic82@reddit
Are you undervolting as well? I just recently got into OC'ing my GPU and have been messing with it a bit, still learning.
Wander715@reddit
Undervolting would be lowering the power limit, so no, if anything this is "overvolting". Undervolting is typically beneficial for efficiency, cooler operation, and slower fan speeds. Raising the power limit will allow you to squeeze as much performance as possible out of the GPU.
The cooler on my card is a beast so even with the raised power limit my temps aren't too bad. At 320W sustained load the highest temps I've seen are like 68-70C. But that's why it's important to buy a card designed with a higher power limit in mind.
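Quick back-of-envelope, assuming the 285 W stock limit mentioned above:

```python
# Sanity check: 112% of the stock 285 W limit lines up with the ~320 W sustained load described above.
stock_limit_w = 285
power_limit_pct = 1.12
print(f"Raised power limit: {stock_limit_w * power_limit_pct:.0f} W")  # ~319 W
```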
Gastronomicus@reddit
No - undervolting is unrelated to the power limit. By reducing voltage, it reduces power use for the same operations, but has no effect on power limit itself. It allows you to potentially increase performance within your power limit.
Wander715@reddit
There's a couple ways to undervolt. One is setting a custom VF curve, another would be to set power limit to something like 90% which would effectively cap the voltage lower. So yes power limit is indirectly related to undervolting.
VoidingSounds@reddit
No, this is wrong and Gastronomicus is correct.
If you drop the power limit the VF curve will be unchanged. You will still be at the same voltage at a given clock, and you will run out of room to boost sooner.
If you're not adjusting the VF curve (downwards) or applying a negative offset you're not undervolting.
Wander715@reddit
In that case maybe I'm just thinking about undervolting wrong. To me it basically means "cap your chip's voltage in some way", whether that be fine-grained control over the VF curve or just limiting the power.
If you limit the power then yes the GPU clock won't boost as high and the max voltage the chip is rated for won't be hit, but yes along the way the VF curve won't be altered.
Gastronomicus@reddit
Think about it this way: power is the product of electrical current (amperage) and voltage. So if you reduce voltage but keep the amperage the same, you reduce the amount of power consumption (and therefore heat production).
The GPU (or CPU) is programmed to draw power according to pre-determined voltage and current settings that balance stability of operations with efficiency. To prevent overheating and exceeding maximum safe current and voltage levels, there are limits to voltage, current, and power. Setting a power limit therefore means limiting voltage and current. It caps the maximum allowed voltage and current, and as such, overall power draw.
This isn't the same as undervolting, which means reducing the voltage level for the same number of operations. There is a range of stability with voltage, and better silicon can run more stable at lower voltages, presumably due to better quality traces and lower resistance in the materials in the boards. That said, recent GPUs and CPUs seem more tweaked out the door to squeeze as much out of the silicon as possible, and undervolting is often less productive than it once was it seems.
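Rough illustration of that relationship; the voltage and current figures below are made-up examples, not any real card's values:

```python
# Power = voltage x current, so lowering voltage at the same current cuts power proportionally.
current_a = 280.0        # hypothetical sustained core current draw
stock_v = 1.05           # hypothetical stock core voltage
undervolt_v = 0.95       # hypothetical undervolted voltage at the same clock

stock_w = stock_v * current_a
uv_w = undervolt_v * current_a
print(f"stock: {stock_w:.0f} W, undervolted: {uv_w:.0f} W "
      f"({100 * (1 - uv_w / stock_w):.1f}% less power for the same work)")
```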
Wander715@reddit
I have a degree in EE you don't need to explain voltage and current to me lol. I was mostly just misunderstanding what people considered undervolting.
Now that I think about it I agree just limiting the max voltage the chip will hit under operation is not really undervolting.
Gastronomicus@reddit
Gotcha! You'd definitely understand then that a lot of people who tweak their cards don't seem to know these basics, so I thought I'd start there. Glad to see we're on the same page.
VoidingSounds@reddit
Yeah, that's a bad way to think about that and conflates two related but in many ways opposite behaviors.
When you undervolt, you are intentionally lowering the voltage at a given clock/across the VF curve. In overclocking/tuning, that leads to a virtuous cycle where less heat is generated and you give your GPU additional thermal and power headroom and hopefully that translates to higher frequencies at safe voltages and more FPS.
If you limit the power you just boost less. Yes, your GPU will see a lower core voltage but it's a consequence of you not being able to move as far right along the stock VF curve.
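A toy model of the difference (the VF points and current here are invented, purely to show the mechanics):

```python
# Toy voltage/frequency curve: (core clock MHz, core voltage V). Numbers are invented.
vf_curve = [(1800, 0.80), (2200, 0.90), (2600, 1.00), (2900, 1.10)]

def undervolt(curve, offset_v):
    # Undervolting: same clocks, lower voltage at each point (the curve shifts down).
    return [(clk, v - offset_v) for clk, v in curve]

def power_limited_max_clock(curve, current_a, power_limit_w):
    # Power limiting: the curve is untouched; you just stop boosting once V * I hits the cap.
    return max((clk for clk, v in curve if v * current_a <= power_limit_w), default=None)

print(undervolt(vf_curve, 0.05))                    # lower voltage, same clocks
print(power_limited_max_clock(vf_curve, 280, 285))  # stock curve, boost just stops earlier (2600 here)
```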
VoidingSounds@reddit
That is correct.
cTreK-421@reddit
MSI Afterburner is a good program; also research safe overclock levels for your particular card.
MathematicianFar6725@reddit
Not sure why you're downvoted, I've been playing in 4k on a 4070ti and DLSS makes it possible to get 90-100fps in a lot of modern games. Especially now that DLSS balanced (and even performance) can look so good now with the new transformer model
fmjintervention@reddit
People get upset if you say anything good about DLSS or frame gen, because they're Nvidia exclusive tech and people don't like Nvidia at the moment. It's fair to not like Nvidia's very anti-consumer business practices, but it's hard to deny that DLSS/frame gen/Nvidia's RT implementation are very powerful tech and only get better when you use them all in combination. A 4070 Ti Super running 4K games at good visual settings at 80-100fps? Sign me the fuck up
Long_Supermarket2047@reddit
In before the unrelated wall of text: they said they upgraded to an AMD GPU, so DLSS and frame gen are barely relevant to their question...
Well, that... and because you aren't actually playing in 4k, so what's really the benefit here?
Like... I'm not saying anybody here said anything wrong (except, missing to answer OP's question related to his actual case I guess) but would it really make sense to spend like 3 to 4 times as much money on a 4k monitor just to then ...not actually play games at the native res? I personally would much rather play at native 1440p on a really good looking HRR Monitor instead of a "just passable" rando 4k monitor.
I guess if you were to take money out of the equation then... hell yea! Go for it. Get a really good looking 4k HRR monitor and at least a 5080 to go along with it and you're golden.
I do have a 4k 120hz capable TV btw so I'm not just talking out my ass (like I tend to do sometimes anyway) but instead talking from actual personal experience.
So yeah, DLSS and Frame Gen (and FSR for that matter) do net you enough performance to not need to rob a bank for a decent enough GPU (and they do look a lot better than just lowering the resolution, don't get me wrong), but I just don't know why you would "set out to use it" instead of using it because it's necessary to get a smooth gaming experience, which is how I view these technologies personally.
For reference: I play on a really nice 1440p 240hz monitor with a Rx 7900 XTX and my TV has a HTPC with my old RTX 2080TI connected to it.
Oh and on a Sidenote for all the DLSS + FrameGen haters... This card can still manage passable frames on my 4k TV thanks to those technologies, which I think is damn impressive. Try running a current title with native 4k on that thing and go enjoy that slideshow...
(I hate Nvidia too. So, no I'm not a fanboy either...)
laffer1@reddit
It’s not that. Just stay at 2k if you want low res anyway. No point in dlss downgrade tech then.
fmjintervention@reddit
All you're saying here is that you have no idea what DLSS is or how it works
laffer1@reddit
It renders at low res and upscales. I know what it does.
fmjintervention@reddit
Yep, so you get the performance benefits of running the game at a lower resolution with the quality benefits of a higher resolution. I'm glad we agree :)
laffer1@reddit
With the screen artifacts, blurry areas and other problems with dlss downgrade tech.
cosaboladh@reddit
I also think it pisses some people off that folks are still running last gen (or older) cards, and enjoying it. I got downvoted just for saying I'm running a 3060Ti, without any problems. Aside from, of course, an inability to run modern titles at max settings.
Am I getting the absolute most out of the games I play? Of course not. Am I having a good time anyway? Absolutely, and I didn't pay $600 for a really disappointing product this year.
MathematicianFar6725@reddit
Also watching 1080p movie streams upscaled to 4k and converted to HDR, it's just incredible. It's a 3GB file that looks indistinguishable from a 30GB file!
PsyOmega@reddit
I can play that 30gb file back at 10 watts. If i use the AI magic stuff on it my GPU is using 250w to render that 3gb file.
That will add up to a rather large electrical bill over the run of an entire series etc.
MathematicianFar6725@reddit
I hope you don't play any of those "video games" either, think of the electricity bill!
But no, my country doesn't have the best internet so it's nice being able to play streams with zero buffering.
PsyOmega@reddit
The difference is, i accept that a video game will cost 200w.
I game maybe 4 hours a week but watch 10-20 hours a week of video content.
Given that balance, I'd rather render video at 10w
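Rough math on that trade-off using the wattages and hours from this exchange; the electricity price is a placeholder assumption:

```python
# Weekly energy for the viewing habits described above; electricity price is a placeholder.
price_per_kwh = 0.15                      # assumed rate, varies a lot by region
video_hours, gaming_hours = 15, 4         # ~10-20 h/week video (midpoint), ~4 h/week gaming

gpu_upscaled_video_w, efficient_video_w = 250, 10
video_extra_kwh = (gpu_upscaled_video_w - efficient_video_w) * video_hours / 1000
gaming_kwh = 200 * gaming_hours / 1000

print(f"Extra energy for GPU-upscaled video: {video_extra_kwh:.1f} kWh/week "
      f"(~${video_extra_kwh * price_per_kwh * 52:.0f}/yr at the assumed rate)")
print(f"Gaming at 200 W: {gaming_kwh:.1f} kWh/week")
```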
MathematicianFar6725@reddit
Suit yourself, for me it's the other way around. I spend most of the day gaming and then watch a movie or episode of a tv show before bed, it's really not a big deal.
There are entire farms of these "space heaters" running 24/7 to mine imaginary coins after all
BasonPiano@reddit
DLSS in and of itself is amazing I think. But it's being used as a tool to avoid optimizing games it seems.
PsyOmega@reddit
Game dev here, some devs do that, sure. But the real problem is that rendering demands are getting more intense in the chase for photo-realism. Every layer of a PBR texture, every ray bounce, etc, has frame time cost.
awr90@reddit
Genuinely curious why games today have these crazy rendering demands and huge storage requirements when, outside of using RT, they look no better than The Division 1 and 2 (2016 and 2019) or Red Dead Redemption 2 in 2018?
Xtakergaming@reddit
I believe some games can greatly benefit from ray tracing and others can't.
Cyberpunk's environments look really cool with ray tracing thanks to its lighting and city lights.
Red Dead Redemption/Oblivion Remastered on the other hand wouldn't make great use of RT in a meaningful way other than reflections imo.
Games with open environments make better use of raster, whereas city environments benefit from RT.
I can justify the performance loss in GTA 5 and Cyberpunk but not Oblivion, etc.
seecat46@reddit
Hello, do you work with UE5? Is there a particular reason all UE5 games run like Crysis?
immaZebrah@reddit
I mean AMD FSR is good too, no?
FunCalligrapher3979@reddit
Only FSR4
PsyOmega@reddit
FSR3 is great if you're outputting 4K
FunCalligrapher3979@reddit
I disagree. Even at 4k the quality mode is a very noticeable downgrade in image quality, so much so that DLSS in performance mode looks leagues above FSR 2/3 in quality mode.
TheGreatWalk@reddit
I definitely don't like how games aren't able to perform well at native rendering anymore. And that's directly a result of dlss and frame Gen, companies realized they can completely skimp out on optimization by basically lying about performance since they give benchmarks with those things enabled, which gives a terrible experience.
Those techs are great if the game can reach acceptable performance standards in native resolutions, as something you can use to get additional performance, but not to reach those performance standards in the first place.
Especially frame gen. The input latency that results from using frame gen is awful, so it should never ever ever be included in benchmarks. And in my experience, DLSS doesn't actually give better performance to begin with. I've been mostly playing The Finals lately, and whether I'm using native rendering (no AA) or DLSS, my performance is roughly the same while being GPU bound (a 9800X3D and 3090 ends up GPU bottlenecked in that game!).
PsyOmega@reddit
Radeon has FSR4 and frame gen now too.
JoshuatTheFool@reddit
My issue is that people are so happy to use it that gaming companies are starting to trust people will use it. It should be a tool that's available for certain people/scenarios, not the rule
UndeadCaesar@reddit
I don't get this part, DLSS is "deep learning super sampling", so isn't it generating frames using AI? Or not rendering at 4K and then using super sampling to make it appear 4K? Not every pixel is rendered "for real" which to me says frame gen.
TheCheshireCody@reddit
Whether those frames & pixels are rendered by the game engine itself or DLSS taking cues from the game engine is as good as irrelevant to the output PQ.
Zuokula@reddit
Because in 4K you lose more quality by downgrading settings than the 4K will give you even with DLSS. 4K cuts FPS in half vs 1440p. DLSS puts it back. You saying you run the AAA titles on max/high/ultra 120fps with 4070ti? Bollox.
MathematicianFar6725@reddit
You can run Cyberpunk at 4k/120fps at max settings (excluding ray/path tracing) using DLSS balanced. Hell, you can throw in medium ray tracing and still get 100+.
This is some real pathetic shit, you guys are really in here stomping your feet in a tantrum because people are enjoying video games at a high resolution? Grow up
Zuokula@reddit
Exactly. You would have double the FPS to play with on 1440p. Allowing heavier ray tracing / path tracing with optimization. Which would bring your image quality way above your 4K.
TonkabaDonka1@reddit
Because any game can be played at 4K simply by turning the graphics down. Running 4K on balanced DLSS basically defeats the purpose of 4K. You might as well drop to a 2K monitor or 1080p and run DLSS at native (DLAA) to get the same sharpness.
Tigerssi@reddit
People don't understand that 4K performance upscaling has a higher pixel baseline (1080p internally) than 1440p quality mode with its 960p.
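For reference, a quick way to see the internal render resolutions, using the commonly cited upscaler scale factors (treat the exact percentages as approximate):

```python
# Internal render resolution per upscaler mode, using the commonly cited scale factors.
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
outputs = {"4K": (3840, 2160), "1440p": (2560, 1440)}

for out_name, (w, h) in outputs.items():
    for mode, scale in modes.items():
        print(f"{out_name} {mode}: renders at {round(w * scale)}x{round(h * scale)}")
```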
KillEvilThings@reddit
What's your clockrates? I'm pushing 2940 peak but on stock power limits with perfect cooling. On power maxed games I'm generally hitting 2880 due to inability to push power to maintain higher wattage.
SolomonG@reddit
If you're getting 70-80 fps in Cyberpunk with DLSS performance and frame gen either that 4080 super OC is not near a 5080 or you are bottlenecking it with your CPU.
I'm getting 120-130 FPS with DLSS performance and 2x frame gen from a 5070ti. Settings are otherwise the second highest ray tracing preset.
AShamAndALie@reddit
Yeah, I wouldn't consider 70 fps with DLSS Perf and FG on a good experience, but that's me.
Jasond777@reddit
That’s only in the most demanding games though like Cyberpunk
AShamAndALie@reddit
Yeah but games are only getting more demanding; in most cases DLSS Quality + RT is a no go unless you've got a 5090. I'm playing at 1440p with my 5080, I just don't wanna deal with having to lower settings all the time.
PoopReddditConverter@reddit
Have you actually gamed in 4k on your own setup? Realtime ray tracing is brand new as far as hardware goes. And I promise adjusting your settings is much more of a problem in your head than it is in real life.
AShamAndALie@reddit
I have. I had a 3090 and used my 4K TV exclusively, and that's what made me downgrade to a 1440p monitor. Now I can play Cyberpunk with Path Tracing, RR, DLSS-Q and FG x2 at 140-150 fps with the 5080, so I have no intention to go back to 4K, but I do play older games on it if I don't need the extra Hz, something slower like the Life is Strange games.
PoopReddditConverter@reddit
Unfort. Cyberpunk is certainly a special case when it comes to graphics implementations but I get the idea. On my 4090 most everything is plug and play.
AShamAndALie@reddit
Cyberpunk, Wukong, now Alan Wake 2: every game that adds Path Tracing will be the same story. Even a 4090 with 4K DLSS Quality and Path Tracing delivers sub 60 fps, making it not ideal to activate FG. You'd have to use DLSS Perf + FG to make Alan Wake 2 at 4K + PT playable.
I'd only confidently go 4K-only and let go of my 1440p monitor with a 5090.
PoopReddditConverter@reddit
Brother, we essentially JUST got realtime Ray tracing. It’s not that 4k hardware can’t keep up, it’s that the bar has been raised outside of the observable universe.
But me, I’ll take 4k and RT or even no RT just because I value resolution more than obscene lighting accuracy.
AShamAndALie@reddit
Of course, some people would even rather play at 4K low vs 1080p ultra. I'm not one of them. 1440p at 27" looks great. Most people claiming their 4K screens look so good have 55-65 inch TVs with half the pixels per inch, so yeah.
PoopReddditConverter@reddit
Not in this household lol. We keep the ppi high round these parts haha. I’m glad you found a setup you enjoy, though. Might have to redownload 2077 tonight and give the lil jewel a stress test.
AShamAndALie@reddit
I did get my 4k TV only 43" so it'd have around the same PPI as the 1440p monitor xD 104 vs 109 I think.
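That checks out with the standard diagonal PPI formula:

```python
import math

# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'43" 4K:    {ppi(3840, 2160, 43):.0f} PPI')   # ~102
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')   # ~109
```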
FeralSparky@reddit
Just wish the frame gen didn't look like total shit with ghosting
moonski@reddit
DLSS performance looks like dogshit though
ulixForReal@reddit
That would be ultra performance which you really shouldn't use. Use quality or balanced.
ChipProfessional1165@reddit
Yes you can. Because the difference between native and 4k performance is almost negligible. Speds I swear.
Prestigious-Walk-233@reddit
Don't render at such low quality; set it to 1440p minimum. I prefer 1800p to upscale from.
moonski@reddit
ok? but that's not what the guy replying to me was saying so good for you
Prestigious-Walk-233@reddit
Just because he has it set to performance doesn't mean it's dog water. Y'all really wonder why games look like shit, but did you know you can run perf mode at a higher res still? They didn't mention what res they use, goofy. So it's perfectly valid info to share; just say you don't fully understand the different upscalers and frame gen technology.
beirch@reddit
Nope performance mode at 4K looks markedly better than at 1440p or 1080p. I play on a TV and I'd take performance mode 4K over native 1440p any day.
Although with a 9070 XT I rarely have to use performance mode. High settings and quality mode is usually enough for a great experience, and it also looks 10x better than ultra native at 1440p.
Nektosib@reddit
I'm on a 5070 Ti at 1440p getting lower framerates than you at 4K with a 4070 Ti, guess we're playing different games
Tunir007@reddit
He probably has framegen on or he’s just lying lmao
moonski@reddit
Frame gen and dlss performance. Hardly going to look great
scylk2@reddit
He literally said that he uses framegen U dumbass
Tunir007@reddit
Mb, was doomscrolling. Then his results can be absolutely legit.
FlorpyDorpinator@reddit
Teehee
moonski@reddit
He's using DLSS quality, which I'm not sure how anyone thinks is acceptable
Wander715@reddit
You aren't managing 80-100fps at 1440p with DLSS on? Something is wrong with your 5070 Ti then.
Nektosib@reddit
All newer and some older games are around 60fps with pt and dlss quality. Check cp2077, new doom, wukong etc etc
Wander715@reddit
I'm using DLSS Performance and frame gen for pathtracing games, base framerate is around 50-60, seems comparable tbh although it's hard to tell since you're at 1440p. Haven't tried out the new Doom yet though.
Early-Somewhere-2198@reddit
Interesting. You are getting only about 5-8 fps more than I am getting with a 4070ti. Guess my pny is pushing hard.
Goolsby@reddit
Anything fps above 30 is fine, any resolution below 4k is not.
scylk2@reddit
It's hilarious the amount of replies from people who obviously don't play in 4k
C_umputer@reddit
I honestly find it hard to believe 4070 ti super can handle 4k, or that it's close to 4080, unless the guy is running some insane OC with barely stable performance.
Jasond777@reddit
Then you’re seriously underestimating dlss
C_umputer@reddit
That is the point of the discussion, if you use upscaling that's not 4k, is it?
zouxlol@reddit
Uhh, yes? The output resolution is at 4K. Just because the image is modified doesn't mean it's no longer 4K. Does anti-aliasing making fake pixels all over your screen change the resolution to you somehow too?
C_umputer@reddit
Did Nvidia pay you to write that? Of course, it's not 4k, that's why we always specify 4k native and 4k upscaled.
Jasond777@reddit
Because it still looks like 4k which is a lot better than native 1440. I’m convinced most of the 4k haters have never seen dlss quality on an Oled.
C_umputer@reddit
It does look fine, but the discussion is about gpu being able to render 4k natively
cbizzle31@reddit
I play 4k on a 3090. People on this sub like to act like 4k will just make your computer explode.
It doesn't, in triple a games I target 60fps and I hit it in every single game and it's an amazing experience.
What's more, the vast majority of games I play aren't triple A. They are smaller indie titles and esports games. They hit 120 on my 3090 with ease.
A 4070 can easily provide a good gaming experience in 4k.
C_umputer@reddit
I've got 3090 too mate, yes I can do 4k but obviously I have to either turn down graphics, use upscaling or not target high fps.
cbizzle31@reddit
That's a far cry from "running some insane OC with barely stable performance."
C_umputer@reddit
That proves my point, 3090 is still a beast, but the games have some insane requirements nowadays. So 4k becomes harder to achieve.
PoopReddditConverter@reddit
4K is not becoming harder to achieve, it’s (with the leap from 3090Ti-> 4090-> 5090) become easier.
C_umputer@reddit
It is becoming harder to achieve, new games are more demanding, and even older games get updates.
PoopReddditConverter@reddit
No, the bar just raised. 8 years ago we were struggling for 4k60fps. 144Hz is achievable in nearly every game with flagship hardware. Save for a handful of BRAND NEW titles, with or without upscaling.
C_umputer@reddit
Again, I'm not saying the 3090 and 4070 are weak, I'm using one myself. But in new demanding titles at native 4K both of them start to struggle and need some compromises; that has been the point of the discussion from the beginning, idk what's so hard to understand.
PoopReddditConverter@reddit
Maybe it's you who's not understanding. The. Bar. Has. Raised. We've had games stressing the best of the best cards since people were playing games on computers. Mfs talking about path tracing and more intensive implementations of real-time ray tracing. It makes no sense to make that claim because of a handful of unoptimized new games with brand new technologies.
C_umputer@reddit
Which is literally what I've been saying since the beginning, idk what you're even trying to argue.
PoopReddditConverter@reddit
Me neither tbh
cbizzle31@reddit
I was going to make this point but tried to stay on topic as to not further confuse the guy.
The fact that a 5090 can play all these games at 4k high refresh rates is the exception. I can't remember the last time a current gen card wasn't absolutely kneecapped by the highest fidelity graphics games.
PoopReddditConverter@reddit
I’m on a 4090. Most everything I play regularly is chilling at 144Hz. Most if not every single setting cranked. The games that actually struggle to achieve these levels of performance are not what people spend most of their time playing. The only game to do that for me was Jedi: Survivor. It necessitated DLSS and Frame gen. Without it, it was still running at 80fps.
cbizzle31@reddit
Yeah I'm on a 3090 and most everything I play is pegged at my 116fps limit I set.
The one or two triple a games I play a year, I spend 10 minutes optimizing settings and I'm easily averaging 60 fps.
4090 is a god damn monster and can easily be considered overkill if your target framerate is 60 for all games even at 4k, and that's a generation behind.
PoopReddditConverter@reddit
I hate the 4k rhetoric on this sub (I also hate calling 1440p 2k but we won't go there). Most of these gamers don't even have hardware that enables them to experience it. But W fellow 4k gamer. 🤘
cbizzle31@reddit
How does it prove your point? I thought your point was that a 4070 Ti couldn't play 4K without a crazy overclock or unstable gameplay?
It absolutely can. Turn down some graphics settings from ultra to high, which for most things you can't tell the difference, and you're good.
On top of that this is nothing new, running the highest graphics has always been a moving target. Most triple a/graphically intense games released at any point in time couldn't be played with stable frame rates with the current "best" hardware.
C_umputer@reddit
Idk what on earth you're trying to communicate. Neither a 4070 nor a 3090 can handle native 4K at high settings without some compromise.
cbizzle31@reddit
I'm trying to communicate with you that what you said in the following post is false:
"I honestly find it hard to believe 4070 ti super can handle 4k, or that it's close to 4080, unless the guy is running some insane OC with barely stable performance."
A 4070 Ti, without a crazy overclock, can in fact play 90% of games at 4K without any concessions, and it can play triple A games with minor concessions and stable performance.
C_umputer@reddit
Please do show us what minor concessions you're talking about. Cyberpunk at max settings will bring a 4070 to its knees.
cbizzle31@reddit
Easy: don't play with path tracing.
C_umputer@reddit
A 4070 Ti Super, in Cyberpunk 2077, at 4K max settings, no DLSS, no FG and no RT, delivers around 30-40 fps (12 with path tracing).
Which is exactly what I have been saying since the first comment.
cbizzle31@reddit
Don't see how that's true cause I just ran the benchmark on my computer with those exact settings and got 45-55 fps.
Even if it were true I just set everything to high instead of a mixture high/ultra (minor concession) I get 60-70
With dlss quality (minor concession) I get 85-95.
Keep in mind this is a 3090 computer which is worse than a 4070 ti.
C_umputer@reddit
3090 is very close to 4070, and honestly I don't see how you're getting 45-55 fps, unless you turned down some settings
cbizzle31@reddit
Regardless when I do set everything to high and turn on dlss I'm getting well above 60fps and literally cannot tell the difference in image quality.
The 4070 is a very, very viable card at 4k
C_umputer@reddit
bs you're getting 60 fps, you must not have all the settings at max
itchy118@reddit
Did you miss the part where he repeatedly said everything is set on high (not ultra)?
C_umputer@reddit
Did you miss the part where he was repeatedly wrong?
Cyberpunk 2077 - 4k native, max settings, no dlss, no fg, no rt, on 4070 ti super, runs at 30-40 fps.
https://youtu.be/idpdzV5hCTI at 5:30 https://youtu.be/hBZcDhV7o4c at 11:47 https://youtu.be/oCY_5-rmXKM at the start
Do your own research, don't just believe what others say, especially on reddit.
itchy118@reddit
But he wasn't talking about running it on max settings, he was talking about high settings...
C_umputer@reddit
I started talking about maximum possible settings without any rt, idk what he means by "high" settings.
itchy118@reddit
He means the graphics settings that you can change between low, medium, high and ultra.
High settings does not mean he is using maxed out settings, it means he is picking the option that is called "high".
C_umputer@reddit
You must be new to gaming, the best way to get performance is to get a mixture of optimized settings. As for the current discussion, "max" which is short for maximum, refers to every visual option being set to the highest setting.
And before you misunderstand again, I am not saying this is how the games must be played. It has been proven that the difference is very hard to spot. The reason why I am talking about max settings is purely for benchmarking, so we can properly compare the performance of different GPUs.
Saying I can run a game on "high settings" means nothing, since you can change a few options and still call it the same thing. And some options don't even have "high", they just stop at medium or are just on/off switch.
itchy118@reddit
He wasn't trying to brag that he can run a game on high settings, he was telling you what settings to use on a specific game to match his experience. That's the opposite of meaning nothing.
C_umputer@reddit
I never said he was bragging. He was saying he could get 45-55 fps on the max settings using 4070 ti super, which I believe we have proved is incorrect. Matching his experience is not difficult, there are quite a few ways to do it.
cbizzle31@reddit
At least he was right about the last thing he said.
itchy118@reddit
Lol. True, that is generally good advice.
bepbepimmashep@reddit
“4K is fine as long as you don’t run at 4K”
Nice
Zoopa8@reddit
The thing is that native 4K arguably looks worse than an upscaled DLSS Quality version of it. That's why it's actually not a silly statement; you always want to enable it. Only if you go down to Balanced you may actually have a worse visual experience, but even then, it may still very well look better than native 1440p and gives a massive boost in performance.
Acuariius@reddit
Lol impossible, native will always look better, but nice try Nvidia
f1rstx@reddit
Yea, RDR2 with DLSS is so much better compared to native TAA garbage. But nice try AMD
FunCalligrapher3979@reddit
native has been ruined by taa
bepbepimmashep@reddit
It legitimately has, I wonder if it’s intentional to push this tech.
Zoopa8@reddit
It doesn't though?
https://youtu.be/O5B_dqi_Syc?t=893
Seems like it depends on the game.
It’s still definitely a no-brainer to always enable DLSS though.
And responses like these definitely don't make a whole lot of sense:
> "4K is fine as long as you don't run at 4K"
> Nice
The video is also over 2 years old, and it was a 50/50 split between native and DLSS with DLSS 2. We're currently at DLSS 4 on the 50 series, so I would definitely not be surprised if most games currently actually look better with DLSS than with a native render.
geeiamback@reddit
Care to elaborate? I haven't heard that before.
beirch@reddit
He's drunk. Native looks better than upscaled. What he probably means is that quality mode DLSS or FSR4 looks better than native TAA/TSR or something similar.
Zoopa8@reddit
https://youtu.be/O5B_dqi_Syc?t=893
https://youtu.be/zm44UVR4S9g?t=16
It's because of videos like these that I said it, but it has been a while, and now that I've partially watched some of them again, it seems like it depends on the game.
bepbepimmashep@reddit
I kinda get what you mean but when I read some of the other replies, I think you’re misunderstanding what we mean. TAA genuinely has muddied the look of a lot of games these days. Monster Hunter Wilds vs World is crazy different because of that. Pulling FXAA out as an option has been an absolute travesty in this industry. Running FSR and DLSS look loads better, but not because it’s better than native. It’s because it takes the softness of TAA out of the equation.
I’ve played through spider-man 2 a ton lately on my setup and I do run FSR for AA, which looks fantastic. It looks very odd and upscaled when I run even “native” quality on FSR though. The same goes for most games. I’ve yet to find any game where DLSS or FSR looks as good in motion as a native render. It also is still misleading to say that it counts as running at 4K, because it isn’t.
Now that I’m thinking about it, I wonder if this push for TAA only is almost a way to force us to use these upscalers in some round-about way.
Zoopa8@reddit
I never said it's running at 4K while using DLSS, FSR, or XeSS.
How is it misleading to say that it counts as running at 4K? OP asked if it's viable to game on 4K displays, not if it's viable to render games in 4K.
If the upscaled DLSS image looks better than native rendering, then what's the problem? That's all that matters when you're asking a question like that. If you look at Hardware Unboxed's video, it was a 50/50 split between DLSS and native, and that was back in 2023 with DLSS 2. We're currently on DLSS 4 with the 50 series, and it has improved considerably. It's definitely not some wild take to say that most games look better using DLSS than when rendered natively these days.
I'm surprised by the number of people who either disagree or seem clueless. You can look it up and see for yourselves.
It's hard to believe you honestly think the trees in this video look better rendered natively than with DLSS upscaling. Look at the graffiti in the middle or the tree above the roof. It definitely looks way more pixelated/worse natively.
bepbepimmashep@reddit
You’re using a video as reference, which is valid for still images but as soon as you move these upscaling solutions turn to mush. You think we’re clueless but we are the ones running 4K displays right now and virtually nobody thinks native is worse in any way visually.
Zoopa8@reddit
That's a valid point, things may change while you're moving around, but it's definitely not true that virtually nobody thinks native is worse in any way visually; the Hardware Unboxed video I showed you proves as much. For them it was a 50/50 split on which looked better, and I'm pretty sure this wasn't just with still images but with actual movement/gameplay involved.
I've got a 4K LG G1 OLED myself.
PsyOmega@reddit
They do. Even games where you can do native without TAA.
Zoopa8@reddit
Glad someone else chimes in to reinforce my statement.
PoopReddditConverter@reddit
I highly disagree. Been gaming 4k144 for over a year and only enable DLSS when I have to. Most everything I play regularly I can tell between native and DLSS. Depends on the game of course but sometimes even quality looks worse than native.
Zoopa8@reddit
“Sometimes even Quality looks worse than native.”
So you're saying it usually doesn't and actually agree with me?
I'm obviously only referring to DLSS Quality mode, not Balanced or Performance, although with DLSS 4 I wouldn’t be surprised if even Balanced is now trading blows with native rendering.
Have you seen this video that’s over two years old, showing DLSS 2? https://www.youtube.com/watch?t=893&v=O5B_dqi_Syc&feature=youtu.be
PoopReddditConverter@reddit
Sloppy wording on my part. With my lived experience, I’d be inclined to disagree with the premise. When I toggle on DLSS even on quality, I perceive the image quality as of a lower level. I’ll have to check the vid out though and do some comparisons myself.
zouxlol@reddit
DLSS handles pixelation at a distance incredibly well, or any other fine details you would normally need TAA to fix. Considering the rest of the image is generally indistinguishable from native, it is a pure benefit from this standpoint
This was not true for DLSS pre-4.0 in all fairness
Zoopa8@reddit
No worries, I couldn't find much and this is just some random video, but if you look at the trees here for example, to me both DLSS 3 and 4 look considerably better.
https://youtu.be/CDHEXNglRzo?t=273
zzzxxx0110@reddit
And up to how many millimeters long is your ghosting trail with DLSS running in Ultra Performance mode just to make literally anything playable at all in 4k? Lol
muh-soggy-knee@reddit
Then you aren't playing at 4k are you?
I mean I'm not saying don't use it; but it's not a particularly useful metric to say "4k is fine because I can run fake 4k"
As for OPs question - The other poster is right, true 4k requires a relatively higher point in the GPU stack than it did a few years ago due to poorly optimised/heavy workload recent games.
beirch@reddit
You're right, it's not true 4K, but it's pretty damn close. You'd know if you tried it yourself. And upscaled 4K (yes even performance mode) actually looks better than 1440p.
That's why a lot of people are saying 4K is valid even without a 4090 or 5090.
muh-soggy-knee@reddit
I have tried it. I use it. Actually I arguably use FSR far more but that's only because the game I play most has FSR but not DLSS. But either way I don't kid myself that my machine is a 4k powerhouse.
DLSS is in many functional ways more of a monitor technology than a graphics technology. It allows you to run a game at 1440p on a 4K display without the softness. It's a good technology. I'm glad it exists.
I'd rather we actually had decent progress in the hardware so that, 12 years after 4K entered the consumer market, consumer-level GPUs could actually run games on it at a decent, stable frame rate.
zouxlol@reddit
You need to blame the developers more than the hardware. They're the ones setting the requirements. Many developers i.e. Riot/Valve are very keen on making their games run on anything in the last decade
Besides that, Nvidia/AMD are both held back by the same constraints - their card improvements follow the nm process improvements. Adding their own software improvements on top of it is one of the best things they can do while hardware manufacturing makes its own independent progress
muh-soggy-knee@reddit
Yeah that's a fair point, and I do think that requirements have outpaced the visible improvement from those requirements. It feels like optimisation is very poor these days. For example I look at, say, Starfield and feel its visuals are fine, but I've seen much better run much faster on much weaker hardware.
I also think it's probably right that I reflect on what "running well" looks like compared to 15 years ago.
My 4070Ti is capable of exceeding 60fps at native 4k ultra in the most graphically intensive game in my recent history (A Plague tale: Requiem) but my idea of "running well" is not 60fps any more. Because both my monitor and TV are high refresh rate/VRR. So I want 120fps and I have to concede that's a me thing.
beirch@reddit
So would I, but Moore's law is sorta dead, and hardware just isn't keeping up with software at the same rate.
Also, you have to remember that when full HD was new, a lot of hardware had issues running AAA games at 60+ fps. Same with 1440p and ultrawide. It's always been like that.
Zoopa8@reddit
It is fine, because the upscaled DLSS version (at least on Quality) arguably actually looks better than a native render.
rainbowclownpenis69@reddit
DLSS at 4k is just upscaled 2k, kinda… right? Fake frames and scaling are cool and all, but playing at 2k without that stuff feels pretty good to me.
Source: 4080 + 7800X3D with 2k and 4k monitor.
DBshaggins@reddit
I have the same combo. What kind of fps are you getting on AAA games at 4k? On max or close settings
CadencyAMG@reddit
DLSS at native 4K always looked so much better than native 2K without DLSS in my side by side testing though. Like even pre-transformer model DLSS looked better than native 1440p when comparing 32in 4K vs 27in 1440p.
The reason why I even finalized on 32in 4K was when I realized I could literally net the same or more performance using DLSS at 4K with better picture quality and more screen real estate on a 4090. The pros far outweighed the cons there. That being said if you use 4K you should expect to use DLSS in most present day AAA use cases.
SirVanyel@reddit
At what monitor size? Your phone could be a 4k and 2k and you literally wouldn't know the difference because of the pixel size.
If you're running a 24 or even a 27 inch monitor, 2k to 4k is hardly an increase. And before you suggest getting an even bigger monitor, both shitty frame rates and a monitor thats too large can negatively impact your gaming experience.
CadencyAMG@reddit
I actually think most people would 100% notice a drop in resolution from 4K to 2K on their phone given the much closer viewing distance which plays a huge factor. Smaller mobile displays have higher resolutions for a reason.
In my comparison it was 27 in. 1440p to 32in. 2160p which is a large and noticeable jump in PPI.
SirVanyel@reddit
Smaller displays have high resolution because it's one of the cheapest features you can throw on a device and it takes up basically zero extra space. Also typing this to you right now my phone is over a foot away from my face, ain't no way I'm seeing pixels from here.
KekeBl@reddit
4K DLSS will look noticeably better than native 1440p. This isn't a personal opinion thing, it just looks objectively superior lol.
rainbowclownpenis69@reddit
4k without all that shit looks better than with it. Not an opinion. Literally superior. I don’t feel like 4k is enough of a visual improvement to ignore the performance hit, even with those features enabled.
You do you, though.
KekeBl@reddit
But you were not talking about 4k vs 4K DLSS, you were talking about DLSS at 4k vs 2k (1440p). And 4K DLSS looks objectively superior to 1440p.
rainbowclownpenis69@reddit
No. I never said it looks better. I said it FEELS better. It’s cool, though. To me the graphical clarity is negligible depending on the game, as well.
TheTomato2@reddit
Lmao
scylk2@reddit
Which sizes are your monitors?
And you prefer the native 1440p rather than upscaled/fg 4k?
rainbowclownpenis69@reddit
27 and 32.
I like it raw. Skip the software foolishness. Just my preference.
scylk2@reddit
Is it just some kind of mental block or do you actually feel like dlss does not look good? Because virtually every feedback I've read says dlss on 4K is awesome especially with the new transformer model.
And since you have both size and rez, how do you decide which game you play on what?
rainbowclownpenis69@reddit
I play Madden, 2k, Forza and things like that on the 4k. Shooters and MMO/RPGs on the 2k. The raw performance without faking frames or smearing an image not only looks better, but performs better - FOR ME.
I have been around long enough to watch DLSS improve quite a bit. It is miles better than it was when it was introduced. I think a lot of people are either unable to run the games without the software foolishness or just haven’t seen what it looks like without it.
Playing unoptimized messes and having to enable these features to get reasonable performance is unfortunate, but shouldn’t be the standard.
beirch@reddit
It is, but I still think upscaled 4K looks better than native 1440p. Upscaling in general just looks better at 4K: There are fewer artifacts and less ghosting.
Quality mode is 1440p upscaled to 4K, but somehow with AI magic it's like a better 1440p. It's honestly very close to native 4K. Even performance mode looks great, especially with a quality OLED monitor or TV.
Bloodwalker09@reddit
I play on my 4K OLED TV (77 Inch) from time to time and 4K DLSS balance and especially quality looks like native 4K from a normal viewing distance. I mean I sit about 2 1/2 meters away from my tv and it’s pretty fucking good. Played Silent Hill 2 Remake that way.
I don't feel like constantly turning my pc on and off and lugging it back and forth between the living room and the office. But when I do, 4K DLSS (without frame gen) is absolutely perfect.
Even the quality level doesn't look as good on a 1440p OLED monitor, but I'm sitting at a desk and therefore much closer to it.
BlazingSpaceGhost@reddit
Have you considered using moonlight/sunshine to stream your PC to your TV? For 120fps you would probably have to use a low powered PC but for 60fps an Nvidia shield is fine. I have a shield in my theater room and enjoy single player games and local multiplayer on my 4k projector. With a good network it basically plays the same as native. Sure beats dragging my PC to the theater room for some dokapon or Mario party.
beirch@reddit
You're closer, but the pixel density is also much higher. I also sit at about 2,5m, but I have a 65" TV. I still think it looks miles better than a 27" 1440p monitor at ~1m though. Even with lower settings and more aggressive upscaling.
Granted, my TV is an OLED and my PC monitor is VA, so it's hardly a fair fight. But even so, the 1440p monitor actually looks grainy in comparison.
_asciimov@reddit
Shh, you're gonna ruin their vibe. /s
AShamAndALie@reddit
4k DLSS Quality looks WAY WAY WAY better than 2k native.
ReplyingToDumbShit@reddit
And if you’re using DLSS, then what’s even the point of going to 4K.
Might as well just play 1440 and have a TV or something upscale it.
DLSS is an excuse and a bandaid to be able to do “4K”
What’s the point
MoukhlisIsmail@reddit
Stupid ass statement
doomsdaymelody@reddit
"4K is totally fine as long as you aren't rendering 4K" is probably the most 2025 statement ever.
AHrubik@reddit
Along with "Upscaled 4K looks better than native 2K". Mate is trying too hard at brand loyalty.
KekeBl@reddit
But it does, at least with DLSS and FSR4. You can test this yourself if you have a 4K monitor. I don't understand why we have to deny reality just because the technology comes from a brand.
AHrubik@reddit
It quite literally can't. You can't make something from nothing.
Scaling and frame gen are objectively worse than native raster. It's adding pixels to a rendered frame that didn't exist before. It is possible that the scaling and frame gen are getting close enough to the native render that "you" can't tell the difference, but it can be seen with close enough examination. Reviewers have been critiquing it for 4 generations now and it's stupid easy to spot the degradation under careful examination. It has been getting better over those 4 generations but it's still not 100% and likely never will be.
Maybe you're mistaking quality for playability? Playing a game native raster doesn't necessarily mean it will have a playable framerate and that by using scaling and frame gen you get a better experience at the cost of some degraded rendering. That does not make the image better it just makes the experience better.
Raunien@reddit
I don't know why you're being downvoted. Best case scenario upscaled images look identical to ones rendered natively. They literally cannot look better, and in any real scenario they will look worse because there will always be artifacts. Maybe it's not noticeable to the typical user in a quickly-changing scene, and that's probably good enough, but "good enough" or even "indistinguishable" is not "better".
AHrubik@reddit
People have lost the plot here defending machine learning manufactured pixels over native raster to satisfy their egos.
They've created this very specific scenario, probably about a non native multiple of their native resolution being used, (aka 1440p on a 4K screen) which may or may not cause some artifacting (depends on the monitor) to justify using upscaling to fix it. Then they come on the internet and use that very specific scenario to justify an assertion that DLSS is better than native. It's just bollocks all the way down.
zouxlol@reddit
He's being downvoted because the conversation is about 1440p vs upscaled 4k, not native 4k vs upscaled 4k
KekeBl@reddit
Would you like to see some comparisons that prove 4K DLSS can look better than 1440p, and usually does? If you're about to complain about compression, I can upload the raw files of the comparisons too.
AHrubik@reddit
I don't need to. I've watched all the critiques for 6 years and I know how to spot the degradation in rendering. Like I said before, just because the experience is better doesn't mean the image is.
KekeBl@reddit
Just looked at your link, it's comparisons of 4k upscaling vs 4k native. That isn't what we were discussing, we were discussing 4k upscaling vs 1440p. Are you trying to move the goalpost thinking it won't be noticed?
Again, I can upload all kinds of comparisons for what we were discussing - 4K DLSS vs 1440p. I know why you don't want me to do that though, because it'll ruin your argument. I think I'll upload the comparisons anyway when I get home from work, just so other people can see for themselves.
Gastronomicus@reddit
What FPS? Even with DLSS and FG I get 100-130 FPS with my 4070 Ti when using pathtracing at 3440x1440. And that's with optimised image quality settings. Which sounds good, but with FG anything below 100 gets kinda choppy because it has an input lag closer to the raw FPS without FG.
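Back-of-envelope on why, assuming 2x frame gen (displayed FPS ≈ double the rendered base):

```python
# With 2x frame generation, input latency tracks the base (rendered) framerate, not the displayed one.
for displayed_fps in (80, 100, 130):
    base_fps = displayed_fps / 2          # assuming 2x FG
    base_frametime_ms = 1000 / base_fps
    print(f"{displayed_fps} fps displayed -> ~{base_fps:.0f} fps base, "
          f"~{base_frametime_ms:.0f} ms per real frame driving input response")
```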
Tamedkoala@reddit
Same here with the same card. DLSS Quality is normally enough to get me to 90-120fps which is my target at 4k. CP2077 and AW2 require DLSS performance to hit that, but DLSS 4 looks night and day better now so I find it worthwhile to take the expensive eye candy now. Back with DLSS 3, no way was the degradation worth taking it down to performance to have blurry eye candy.
Ryan32501@reddit
If you heavily using DLSS you are not playing at 4K. That sounds like a console slogan. "The new PS5 at 4K 120hz!" While you are pretty much at 1080p with the heavy use of upscaling. Kinda defeats the purpose imho
scylk2@reddit
Awful take lmao
Ryan32501@reddit
How? That's literally the truth of the matter. You can't say you are gaming at 4k when you are in fact not gaming at 4k? Does that only make sense to me?
Ryan32501@reddit
Why the fuck was I downvoted? I spoke the literal truth. You can't say you're gaming at 4k when you are not gaming at 4k. Are people really this smooth brained over here?
beirch@reddit
The difference is that a PS5 upscales anywhere from 720p-1440p with medium settings and 30fps. That's a completely different experience than high settings, quality upscaling and 60+ fps, which a lot of GPUs can do.
Once you reach those kinds of settings and performance, then it's very close to the native 4K experience. You'd know why people think upscaled 4K is still 4K if you actually tried it. Upscaling just looks 10x better at 4K than 1440p or 1080p, so it's not the crutch you think it is. It's pretty damn close to native, even on performance mode with a quality OLED monitor or TV.
No_Salamander_6768@reddit
Ahh yes. Use the disgusting upscalers that were originally for lower end cards and make your games look blurry, because paying over $1000 for your GPU and then having to do that totally makes sense.
ONE_PUMP_ONE_CREAM@reddit
Yeah, but you're basically required to use DLSS for decent frame rates, and it makes your CPU work overtime.
bluezenither@reddit
sooo basically 720p upscaled ai frames gaming?
JumpyDaikon@reddit
Yeah, it works with fake resolution and fake frames. But real 4k gaming is not happening.
ulixForReal@reddit
May work on older games. Forza Horizon 4 & 5 should run great in native 4k, they're even well optimized for AMD cards.
elAhmo@reddit
Which means it is not fine
FullyStacked92@reddit
"4k is fine as long as you absolutely do not run the game at 4k and instead run it at a lower resolution and fake a higher one"
InsertFloppy11@reddit
Frame gen and good experience in the same sentence? What?
Wander715@reddit
Don't listen to reddit when it comes to frame gen. I swear so many people on here have never actually tried it and hate on it anyway. Notice how it was the absolute worst thing in the GPU market when it was an Nvidia exclusive feature but now everyone has accepted it since it's available with FSR?
As long as your base frame rate is 60+ it's a good experience imo. I've even used it with base framerate down around 50 and with Reflex the latency isn't too noticeable.
Prestigious-Walk-233@reddit
This. With good base frames, anything over 1440p (if you choose to upscale) will look good, 1800p preferably.
fatalrip@reddit
Agreed, that’s like black magic.
Ben_Kenobi_@reddit
Frame Gen absolutely shocked me at how good it actually works. I assumed it was all a bunch of marketing bs before I tried it. The frames "feel" real, and in most of the games I've played, in real time, they look real too.
Yeah, if you play Sherlock Holmes instead of the game or if it's poorly implemented, you can find weird shit, but agreed. It's black magic that works surprisingly great.
BigShotBosh@reddit
Mfg is amazing if you have a high enough base framerate.
Kinda weird to see the “frame gen bad” meme still going
ViperAz@reddit
framegen is good when your base framerate is high enough.
Flutterpiewow@reddit
With nvidia, absolutely
Prestigious-Walk-233@reddit
What he said, frame gen is pretty much necessary. I'm running a 7900 GRE with a 7800X3D; Oblivion Remastered at max settings averages about 80 to 140 fps depending on area, and the Gears Reloaded beta at max settings is an easy 180 fps without frame gen btw... So it's definitely game by game whether it will need it.
Snakekilla54@reddit
How are your hotspots? Mine can't be OC'd cause it'll reach the hotspot temp limit of 88 real quick, and this is even after an RMA.
SimpleMaintenance433@reddit
DLSS and frame gen 😄
So not 4k then.
Ben_Kenobi_@reddit
Agreed. I know not everyone has that type of hardware, but that's how PC gaming always worked. I remember when I was younger, just being happy the new game I bought ran at any setting. Social media culture also wasn't there to push the "need" for upgrades, so it was all whatever.
Also, resolution is so game dependent. You can play a lot of indie games at 4k comfortably on a lot of hardware.
skittle-brau@reddit
> Now it's basically unfeasible since the performance of new games are deteriorating 100x faster than gpu upgrading.
That's my secret, I almost exclusively play single player games that are 5+ years old. It's a win for me because the games are cheaper and I can actually run them in 4K at high frame rates.
The few current multiplayer games I play aren't graphically intensive.
Late-Button-6559@reddit
Not quite.
I remember the 2080ti being THE 4K card.
Then the 3090, then the 3090ti.
The 4090 finally did become IT.
Ignoring upscaling and fake frames, the 5090 is now IT.
Even a 4090 is no longer enough for “true” 4K, max settings gaming.
I’m basing each card on the games that were current at release.
How sad :(
PoopReddditConverter@reddit
Why is it always people who don’t game in 4k chiming in. The lived experience of most gamers with a flagship card is NOT using dlss. Most everything runs fine without it. The number of games where upscaling is required for 120+ fps is in the single digits.
Late-Button-6559@reddit
What do you mean “people who don’t game in 4K”? Do you mean me? Because I exclusively do, and have owned all the cards I’ve mentioned.
PoopReddditConverter@reddit
Sorry but you came across that way to me. Overall, my disdain is more about the general theme of the rhetoric every time 4K gets brought up. But to say a 4090 is no longer enough for true 4k is asinine. Save for like a handful of the newest games. There have always been games that came out that the newest cards couldn’t run. We just got reliable 4k144Hz with the 4090 along with real time ray tracing. Now mfs talking about photorealistic path tracing and how frame gen and dlss is not enough for current cards to run it 😭😭😭 hello!??? The bar couldn’t be any higher. In ten years we’ll be talking about how our fusion powered graphics cards can’t even simulate quantum physics in realtime. The bar just keeps moving.
XediDC@reddit
Yeah... I was running my 1080 on a 2K quad array since it came out ~9 years ago, and 4K up to 144 for the past few years.
I'm not into most of the bleeding edge games, but it runs, say, ol' PUBG at >100; Cyberpunk is meh at middling settings, around 45, but I still prefer it at 4K. BG3 is great. Pinball FX3 is awesome locked at 144@4K...and the refresh rate makes a difference more in that game than in most more recent fps stuff. Even Wukong was...playable...with some careful settings.
*before the naysayers say it won't support dual+ 4K @ 144 again...nvidia (shockingly) released a firmware update in 2023 to enable DisplayPort 1.3/1.4 on the 9/10 series.
But I was all 2K in 2014. And before that was 1600x1200...which I still miss, but I've kept one Samsung 204B alive in its honor. 1080p is truly ancient, ditched that 19 years ago...might as well play on your phone.
PoopReddditConverter@reddit
I absolutely believe you, I was playing cod mw2019 and pubg in 4k60 on my overclocked 1070Ti. Granted, mostly lowest sometimes medium settings. But native 4k has gotten monumentally easier to render over time, save for the handful of games that nearly require upscaling. Saying anything otherwise is asinine.
pdz85@reddit
I do just fine at native 4k with my 4090.
dorting@reddit
Upscaling and Frame generation are there to make 4k gaming possible
Late-Button-6559@reddit
Not my tempo.
I mean 4K gaming was possible on the top end gear for the last few generations - at a sensible price.
This gen though (games and gpu) it’s become stupid prices AND crap game optimisation, for 4K.
Dlss and framegen should just be to allow a 70ti and above to hit ‘fancy’ higher frame rates. Their baseline should still be around the 60 mark (excepting path tracing).
5090 should be a 4K/120 card (at native res, maxed out).
awr90@reddit
4k 60-90 fps was possible with a 1080ti in 2017. I played division 1, 2, red dead redemption 2 etc all at 4k and they look just as good as brand new games. Somebody needs to explain why now a 4090 is minimum for 4k anything native.
XediDC@reddit
Pinball FX3 runs great at 4K 144 on a 1080 too. PUBG gets 100-120.
I haven't been inspired to upgrade yet. And I left 1080p in the dust two decades ago.
GhostWokiee@reddit
It would be possible if everyone and their mother wasn’t using those as a crutch
Clean-Luck6428@reddit
Reddit has a political agenda against dlss and will downplay the experience on mid end cards because they genuinely think they can get developers to rely less on dlss by going on strike or something.
4070 super and above is adequate for 4k gaming
AisMyName@reddit
4K beautiful for me so far. I mean I don't get 200fps, but anywhere from like 90-110 feels smooth. i9-14900k, 4090, 4k 240hz.
scylk2@reddit
What monitor size you play at? If you upgraded from 1440p, would you say it was worth it for games?
AisMyName@reddit
32” curved Alienware OLED. I had the same size and shape monitor in 1440p before, from Samsung.
I love the look of oled. And 4k is so sharp. With a 4090 it’s smooth. I suspect worse cards can’t drive it always in like cyberpunk 2077 and the like. It is great though for me
pdz85@reddit
I have a very similar setup! 32" MSI MAG321US OLED with a 14700k and 4090. Very happy I made the jump to an OLED vs 4k IPS.
AisMyName@reddit
I looked at that same monitor and almost pulled the trigger. Only did the Alienware one via DELL.com cuz at the time it was a Slickdeals thing, and all these codes, buy it with Chase get like additional 10% off, etc. etc. I forget what I paid, but it saved me I think a couple hundy going Alienware.
Yeah OLED is so beautiful.
PsychoActive408@reddit
Hey we have the exact same setup!
tamarockstar@reddit
Is that mostly due to using Unreal Engine 5 and not optimizing for that engine? I watched a video about that recently.
aa_conchobar@reddit
I don't think it has gotten much worse than it always was. There has always been that trend. I remember in 2013, I bought the gtx titan (6gb), which was the best card you could get back then. But even still, games that came out that same year and the year after would struggle on ultra at just 1080p. I remember playing total war 2 and having problems with unit sizes at ultra 1080p
Probably the most irritating thing about GPUs back then was that games were getting more complex than GPUs could handle. Today, I don't see that problem anywhere near as much, but there's a new problem: fucking prices.
PoopReddditConverter@reddit
Staunch 4k gamer here. 100% feel the opposite.
Before the 3090Ti, 4k gaming was… yucky. (Although, I was playing certain games in native 4k60 on my 1070Ti OC’d on medium settings fairly consistently)
90+ consistent fps in native was easily achievable after its launch. Of course, it was a terrible fucking value (please buy my 3090Ti) and I’m sure several electrical substations across the globe burned down because of it.
But with the release of the 4090, playing native 4k games at SOLID 144fps was a given, granted we’re not talking about a few of the newest games. The rest of the 40-series made casual 4k60 for enthusiasts totally viable.
For the 5090, there are VERY few games (the new Indiana Jones, Jedi: Survivor, CP2077) that will give you a hard time without using upscaling techniques and everything cranked.
For reference, the first game I ever actually ran into problems with on my 4090 and was forced to use frame gen on was Jedi Survivor. And only because I was getting 80-100 fps instead of the 144+ I craved. Not everyone is playing only the newest titles. And the newest titles have the best implementations of frame gen and dlss. You could make the argument that devs have gotten sloppy with optimization due to being able to rely on the use of AI imaging techniques, but it’s still counter to your point.
All around, 4k is still in its early adoption phase, and we all know early adopters pay more. The medium-tier hardware 4k experience is leaps and bounds better than it was even 5 years ago. It’s more accessible and more feasible than ever.
Auervendil@reddit
lol wat, no it wasn't. 4K has not been mainstream even once. I remember buying a 3090 when it came out, which was touted as an "8K" card, just to be disappointed, after fighting tooth and nail not to be scalped, that it was like 10% better than a 3080.
GPUs could only keep up in terms of performance and price back when 1080p 60fps was all people cared about and the UltraSharp was the best mainstream monitor, and even that wasn't general knowledge. I've had friends build high end systems and use some cheapo TN from 5 years back. Now? Display tech has outpaced silicon and you can't put that high res, high fps genie back in the bottle, don't even start with ultrawide.
The word "optimized" back in 2015 no longer means the same thing now. Developers really just need to give up on chasing graphics so we can get steady releases of sensible games, actually make something that's fun to play and not a nightmare to code/clean.
BlazingSpaceGhost@reddit
Yeah my 4080 felt amazing for 4k gaming and now it really doesn't cut the mustard. I run most games at "4k" (with dlss so not real 4k) but I've had to run a few games at 1440p to try and get a consistent frame rate.
MiguelitiRNG@reddit
this is so wrong it is actually funny
bwat47@reddit
It's not unfeasible, use DLSS quality. DLSS quality looks on par with (sometimes better than) 1440p native
Financial_Warning534@reddit
Uh with all this upscaling tech there's no reason not to have a 4k panel.
Upscaled 4K looks better than native 1440p.
danisflying527@reddit
How does this get so many upvotes?? It’s ridiculous that 500 people read this and legitimately agreed with it. Dlss4 has made 4k gaming more viable than ever…..
MathematicianFar6725@reddit
People are in here throwing an absolute tantrum. No idea what they're so upset about
DEPRzh@reddit
IDK man, I'm also shocked. Maybe people just hate UE5...
tan_phan_vt@reddit
I'm using 4k and while it is true, upscaling from 1080p is also an option.
With 4k Integer scaling is also an option too so not all is doomed.
But yea, newer games sure run horribly, only a few run great. Doom The Dark Ages is a good example of a highly optimized game.
scylk2@reddit
Indiana Jones seems ok no?
tan_phan_vt@reddit
Oh yea that too. Idtech 7-8 all run great.
Aquaticle000@reddit
Quite possibly the most optimized game engine ever created. The older ones are a bit of a pain to play on now since they’re limited to 60 FPS with no way to raise it. Driver level frame generation can address some of this though.
Not that 60 FPS is the end of the world or anything lmfao.
Aquaticle000@reddit
Unreal Engine 5 at work.
Lightprod@reddit
It's fine for 95%+ of the games. Don't generalise for the few AAA unoptimised trash.
PrashanthDoshi@reddit
This
Psylow_@reddit
2.5k gets you 4k 60 & 4k 120 in some games
FFFan92@reddit
5080 and 9800x3D with a 4K OLED monitor. Games play in 4K great and I consistently get over 100 fps with DLSS enabled. Not sure where you are getting unfeasible from. Although I have accepted that I will likely need to upgrade my card around the 7 series time to keep up.
prince_0611@reddit
Damn i was gonna ask why 4k gaming didn’t catch on. Back when i was in high school it was the obvious future but now I’m about to graduate college and the resolution of games hasn’t changed at all pretty much.
Melodic_Slip_3307@reddit
fr and no one puts any effort in optimization unless it's Ubisoft or Helldivers, i always found Ubi games to run well
snmnky9490@reddit
Really? I constantly see people ragging on Ubisoft specifically for being one of the worst offenders of terribly performing unoptimized games. I don't think I've ever played one of them, but I still have seen complaints for years and years about it.
Melodic_Slip_3307@reddit
Division 2 on a 7900X, 64GB and the shittiest 4080S is 80-90 FPS on 4K all max settings for instance. I believe Breakpoint ran the same or better with an older pc: iirc 5800X, 32GB and a 7900XTX (?)
trashandash@reddit
Star wars outlaws and assassins creed shadows do not run well
Melodic_Slip_3307@reddit
can't say for certain
Moscato359@reddit
Àààaaaaaaaaaaaah screaming
Sea-Experience470@reddit
The bigger the screen the better 4K is imo. On a small 27 inch screen it’s not as noticeable.
Gry20r@reddit
Better 1440p at 144fps than 4k at 60fps.
Native will always surpass upscaled bricolage. You can also let your TV upscale if you got a modern one, some upscaler like in modern LG or Samsung do a very good job.
bhm240@reddit
Always get a 4k monitor if you can. No reason not to since all the old games work well on 4k and all the new games have dlss/fsr.
BigMaclaren@reddit
anyone talking trash on 4k probably doesn't have a 4k monitor. I was playing most games on a 7900 GRE, and now with a 5080 I play any game I want and it looks infinitely better. I even recommend a 27 inch 4k monitor because it's truly the sharpest experience.
Brisslayer333@reddit
You don't have great framerates at 4K, though. At 1440p you'd have a much smoother experience, which is more important to me personally. 1440p 240Hz feels so good, whereas 4K 90Hz feels less good.
BouncingThings@reddit
How do you deal with the text? I have windows set to big text font size and scaled to 150% but it either f ups n glitches out or just reverts back. And lots of apps don't support text scaling (like my nvr) so I'm literally looking at tiny dots for text. Firefox I can also increase font size or zoom in but that again, breaks the viewing experience nearly across the board. YouTube shorts are nigh unwatchable and clicking thumbnails 80% of the time takes me to the 'paid sponsorship' page because the link overtakes the thumbnail. I'm only about 1.5ft away from the 27" monitor, maybe 4k just needs a 32+ monitor ?
Danny__L@reddit
people who trash on 4k are usually playing games that don't need to be played at 4k
Like, I mostly play esport games, I'm not going over 1440p anytime soon and I prefer FPS over resolution.
But I get why people like 4k if they play more single-player stuff.
klaizon@reddit
What are we, ants?
KekeBl@reddit
Some people are used to sitting fairly close to their desk and don't want to turn their head left and right during gameplay, so a 30+inch screen doesn't appeal to them.
4K at 27 inches also has such insane pixel density that upscalers like DLSS and FSR4 work even better and can be used on even more aggressive settings, because the artifacts are less noticeable.
scylk2@reddit
Did you upgrade from 1440p? If so would you say it's worth the money and framerate hit?
Plane-Produce-7820@reddit
I’ve upgraded in the last 2 weeks from an Asus TUF VG27AQ 1440p to the MSI 321CURX 4K.
Was well worth it. Satisfactory is my worst performing game fps wise, hitting 70fps with all settings maxed and global illumination maxed. Shadow of War gets 110 fps native or 60 fps when rendered at higher than 4K resolution, and Bannerlord gets 80fps in 1000-army battles, all on a 4070 Super. All of these games lost 40fps but look much better (QD-OLED definitely makes a big difference as well).
Minecraft was my worst scaled performer going from 900fps to 180fps on my ryzen 5 7600.
scylk2@reddit
Are you happy with the MSI? I'm thinking getting the flat version
Plane-Produce-7820@reddit
At 32 inches I think it's time to start going curved, depending on how far away you sit; I went from 27 inch flat. I sit just over arm's distance away on my setup though.
My only complaint is having to switch the panel protection notice to 16hrs (when it will do a hard shut off for a bit to protect itself from burn in); the default is 4hrs, which you have to cancel, and you can only do that a max of 4 times before it's forced. I'd also prefer a smaller square base so the stand wasn't taking up as much real estate, as they do come out a bit further than the screen.
From all of the comparing I did between QD-OLED monitors the main difference was curved vs flat, refresh rate and the adjustability of the monitor arm. Everything else was pretty much identical in terms of colour accuracy, true black 400 etc.
I can send you some photos if you want to message me.
GladlyGone@reddit
I went from 1440p to 4k and I'm enjoying it a lot. I'm my situation, it was worth it. If you have the money to drop for it, I'd recommend it.
scylk2@reddit
thanks!
Middle-Amphibian6285@reddit
Very true, I could never go back to lower than 4k since I got my new card
Lower_Accident_7130@reddit
6950xt here. I'm a couch gamer on my 65 in 4k tv. I don't really have any issues except in oblivion remastered, but I think everyone does. Just use your preferred upscaler or turn some settings down
Effective_Baseball93@reddit
Your eye will adapt to the resolution faster than you'll accept the performance hit. I regret switching
Competitive-Cry-2193@reddit
I use a 4k 120hz tv with my 9070xt. I’m able to play some games at 4k with FSR such as Dune Awakening, Kingdom Come Deliverance 2 and some older games at native 4k like Insurgency Sandstorm or older battlefield titles
Abombasnow@reddit
When did 1440p become "2K"? If 2160p is 4K, 2K is 1080p. When did 2K, half of 4K, somehow become 75% of it?
chaosthebomb@reddit
It's a misconception due to how the resolution names line up. 4K is named for its roughly 4000 horizontal pixels. It also has the same pixel count as 4x 1080p displays. So people go "oh, 4K is 4x, therefore 1080p must be 1K and then 1440p must be 2K!"
The problem is people forget resolutions are 2 dimensional. And an increase of 2x in 2 dimensions is actually 2x2, or 4x. The 1/3 lb burger failed for a similar reason, because people thought 1/3 was smaller than 1/4. The general public just sucks at math.
It also doesn't help that manufacturers use this incorrect nomenclature in their marketing, making the problem even worse.
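If it helps, here's a minimal sketch of the arithmetic (plain Python; the resolution labels are just the common consumer names, used for illustration):

```python
# Pixel-count comparison of common consumer resolutions.
# Doubling both dimensions (1920x1080 -> 3840x2160) quadruples the pixel count,
# which is why "4K"/2160p has 4x the pixels of 1080p, while 1440p sits in between.

resolutions = {
    "1080p (FHD)":    (1920, 1080),
    "1440p (QHD)":    (2560, 1440),
    "2160p (UHD/4K)": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:16s} {w}x{h} = {pixels:>9,} pixels ({pixels / base:.2f}x 1080p)")
```

Running it shows 1440p is about 1.78x the pixels of 1080p and 2160p is exactly 4x, which is why calling 1440p "2K" doesn't line up with the pixel math.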
Fantorangen01@reddit
The DCI spec for movie theaters use "2K" and "4K". I wonder when they started to use those terms, was it before or after 4K became a mainstream term?
DCI 2K is 2048x1080. DCI 4K is 4096x2160.
Abombasnow@reddit
What... is that awful abomination of a spec? 256:135? What is that gobbledygook? What's even using it?
Films are 1.85:1 and 2.39:1. This is why even on an ultrawide, films are going to be letterboxed, because neither of those correspond exactly to a standard aspect ratio.
Why did they make TVs and monitors and stuff with a different aspect ratio standard? I don't know. But I also don't know why we're stuck with 23.976/24 FPS still for television shows or movies. shrug
MonkeyVoices@reddit
I'm pretty sure that's been the standard for filming for a very long time, and it happens to match those resolutions.
As for the TV show frequency: it's generally agreed that it looks better filmed at 24 for most people, and I'm pretty sure it's harder to exploit the benefits of higher frame rates for CGI, and it would be much more expensive.
Abombasnow@reddit
Who agrees? It was literally a financial reason why we had 23.976/24 in the first place.
Soap operas using 30/60 FPS were always said to look a lot nicer than normal TV shows and VHSes at 59.94/60 were also always said to look really crisp.
If you get the DVDs for The Golden Girls, or other VHS shows, you can "bob" them which plays them back properly as they were on VHS, at the crisp, beautiful 59.94 FPS. This leads to it looking far nicer than any other non-VHS DVD show because the motion is just so crisp and smooth.
24 is just... why? It's stupid.
Fun fact: the .06 off (59.94) or .024 off (23.976) was because of color taking up a small amount of the playback space on those formats.
CGI would look a lot nicer not having to be slowed down to such pitiful frame rates, especially since CGI is usually at half speed. 12 FPS... next time you watch Marvel movies, if you do anyway, notice how slow anything goes when it gets CGI heavy. 12 FPS is so bad you can count the frames.
Not a good metric as they'll always claim everything is more expensive because of Hollywood accounting.
Raunien@reddit
The main reason we still use 24 FPS for films is inertia. It was originally a combination of cost saving (film stock is expensive, and was astoundingly expensive back in the day) and ease of use (24 is highly divisible). It's the slowest speed that still looks like smooth motion, so film makers could get away with it. It's not viable for, say, video games, because they are made of a series of truly still images, but each frame of film will have a slight amount of motion blur which helps to trick the brain into accepting the illusion of movement.
Abombasnow@reddit
24FPS isn't very divisible at all, though? It's also pointless on 90% of TVs since most TVs are 60 Hz (which does not divide cleanly into 24), and even if you set something like Kodi to "switch refresh rate to playback frame rate", most TVs aren't going to be able to do that. What they'll do is merely insert duplicate frames to pad for it... so it's basically new-age telecining.
120 Hz TVs exist, usually high end, but I don't think any of them legitimately go as low as 24 Hz. 30 is usually the lowest, so the same telecining trick is used.
30 FPS would've made a lot more sense to shift to because it's cleanly divisible in every single TV panel and the majority of laptops or monitors (144 Hz and other oddball overclocked refresh rates like that or 160 Hz, etc., being the rare exceptions) also support it no problem.
Raunien@reddit
24 is divisible by 2, 3, 4, 6, 8, and 12. It's a highly composite number. It's the smallest number with 8 factors (mathematically we include 1 and itself when considering factors, but that's not relevant from a practical standpoint). To say it "isn't very divisible at all" is to prove how little you understand what you're talking about. That kind of easy maths made early film editing and so on much simpler.
Back in the day, TVs ran on whatever the frequency of your electrical supply was. If your supply was 50Hz, that's what you got. If it was 60Hz, that's what you got. Although the actual displayed frames was typically half that but interlaced, to double the perceived frame rate. When converting film to the tape-based format used for storing television broadcasts, they also had to align the frame rate, and the way did this was by slightly increasing the speed of the film to 25 or 30 (exactly 1/2 the electricity frequency) and, for NTSC, duplicating frames every now and then because the jump from 24 to 30 was a little too much. You seem to be vaguely aware of the concept of telecine but I don't think you understand its (lack of) relevance to modern screens. If a screen is running at a higher refresh rate than the input it's getting, it will simply continue to display the previous frame until it gets a new one. Of course, anything made on film and released on a modern format will have already undergone the 2:3 pulldown to convert it to 30 FPS to avoid the issues that arise from 24 not being a factor of 60 (although any halfway intelligently designed screen can solve this simply by being able to buffer more than one frame).
Abombasnow@reddit
None of those are relevant when the two leading TV refresh rates were 50Hz and 60Hz, neither of which it achieves by doubling.
It was to avoid it looking slow, actually. That isn't how interlacing worked unless you meant for video tapes/kinescope which I believe worked like video tape and went up to 60FPS (59.94 color)?
Not an issue unless you... need audio synced.
oh haha you haven't seen some lazy companies have you?
I Dream of Jeannie's Blu-Ray releases had caked in interlacing because the brain-dead morons at Mill Creek (a notoriously bad outlet) forgot to detelecine it before putting it down to 23.976.
Or how lazy ANY streaming service is with any Norman Lear TV show/Golden Girls/The Nanny/etc.; video tape shows that SHOULD be 59.94 FPS get... outputted to 29.97, because these services couldn't be bothered to bob them (another necessary procedure, similar to detelecining).
VHS shows on streaming services are so ugly. It's a shame because the Hulu version of Golden Girls has WAY better coloring than the ugly yellow DVDs, although it does have an annoying blur/smoothness to it, but the frame rate is just death, it LOOKS slow.
I Love Lucy finally got Blu-Rays recently. Too bad it was done with AI upscaling from film. Why? Lazy, that's why. [This picture will keep you awake for years. Don't say I didn't warn you.]()
Fantorangen01@reddit
24 fps does a lot of work hiding imperfections. Like when you watch a high fps movie the acting just looks less convincing.
Also. 24 fps just is more cinematic. Think about how animators use lower frame rates to emphasize certain movements. Like a punch.
KingdaToro@reddit
Pretty much nothing uses the "full frame" of the DCI standard. Anything wider will use the full width but not the full height, and vice versa for anything narrower. It all comes from film scanners, which have a single row of pixels that scans film line by line. A 2K scanner has 2048 pixels, a 4K scanner has 4096.
BroderLund@reddit
It’s referred to as 17:9 aspect ratio. You see on Netflix some movies have a tiny letterbox above and below. That movie is native 17:9. Quite common.
Abombasnow@reddit
It isn't actually 17:9 either. 4080x2160 would be 17:9.
Films don't have a native resolution or aspect ratio hence why shows/movies shot on film, when the studio re-releases aren't so lazy, can be made 16:9 very easily, even if they were initially 4:3. i.e: M.A.S.H., Buffy the Vampire Slayer, Frasier 4K web only, not Blu-Ray, etc.
Of course, you can sometimes see things that weren't meant to be in shots initially (mirrors with reflections in Buffy, watches/cars/etc. that shouldn't be there in the 1950s for M.A.S.H., etc.), but still.
Fantorangen01@reddit
It's only slightly wider than 1.85:1, like it's 1.89:1 or something. Or maybe I misremembered the exact numbers? Anyways, that is the spec for the projection, so movies don't necessarily fill the screen.
WorldProtagonist@reddit
The 2K term was in use in digital cinema before 4K was a common resolution or term in any space.
I first heard the term 2K in 2007 or 2008, from someone who worked at a movie projector company. TVs were still in their 720p/1080i era. Computer monitors hadn't even settled on 16:9 at the time and were still often at resolutions like 1024x768.
hank81@reddit
Man.... 3840x2160 = 1920x1080 * 4
That's why it's called 4K: four times as many pixels as Full HD.
TheGreatBenjie@reddit
That's not how that works...at all.
AndrewH73333@reddit
1440 has almost half the pixels 4K has. It’s not that bad. 4K was already going by the wrong number.
Wet_FriedChicken@reddit
2K is the common colloquialism for 1440p.
coolgui@reddit
The terms we use for resolutions are weird. Usually 4K is actually a little less than 4K and instead should be called UHD or 2160p. But 4K became a buzzword so they call it "4K class" if you look close at the packaging.
2560x1440 is more like 2.5k, technically should be called QHD (quad hd). But most people don't.
1920x1080 is "2k class" but should be called "FHD" (full hd) but most people don't. lol
It gets even more weird using the numbers with ultra widescreen monitors. I think 3440x1440 should be called UWQHD, but it's getting silly at that point.
dom6770@reddit
I mean, you need some way to differentiate, and FHD, QHD, UHD and UWQHD are fine for me.
And technically, 2160p/1440p is also incorrect. Those are video resolutions (p standing for progressive, in contrast to i for interlaced). Displays don't use p/i, it's only a video thing.
But yes, it's so stupid when QHD gets called 2K. It makes no sense.
damnimadeanaccount@reddit
It's nice, but to me the added costs in hardware + energy and resulting extra heat makes it just not worth it.
It's also not that noticeable on a 24" screen, which is the maximum (depending on distance) I can play competitive games comfortably on without having too much eye or even head movement.
For people who like the big screen immersion it's another story probably.
Certain_Garbage_lol@reddit
Don't forget the 1440p 240hz or more oled monitor upgrade... Because that's a hell of an upgrade 😌
jedimindtriks@reddit
As a 4k gamer, i just cannot go back. I cant go back to 1440p shitty text quality and awful jagged edges. I would much rather play in 4k and lower settings
PollShark_@reddit
I went from 1080p 60Hz to 1440p 144Hz, and I thought that was amazing. Then I got the chance to go to 4K 144Hz, and what I noticed is that my frames pretty much halved. It was crazy. The details were gorgeous, but the problem is that you only notice the details the first few minutes. Then it went back to feeling like 1440. Granted, in some games the extra pixels helped, but not anywhere close to as much as the jump from 1080 to 1440 was. Finally I managed to find a killer deal on Marketplace where someone was trading a 1440p ultrawide for a 4K monitor. And now I have that 1440p ultrawide and I will never go back to a regular sized monitor, the difference is CRAZY! So moral of the story is go with 1440p ultrawide.
jedimindtriks@reddit
this fucks with my brain, going from 1080p to 1440p felt like a natural upgrade for me
But going from 1440p to 4k felt like i was in a new crisp world.
Albeit it might be related to playing games with proper 4k textures. so the difference for me was massive. Seeing text in 4k for the first time was almost jawdropping.
RemarkableAndroid@reddit
Exactly my experience. I’ve had 21:9 ultra wide and bought a dell 4K monitor. Nice and crisp but missing the ultrawide view was detrimental for me. I returned it after 2 days. I’ll stay with my ultra wide while waiting for a 21:9 2160 monitor.
Zatchillac@reddit
You don't have to wait assuming you have the funds
LG 45GX950A-B
RemarkableAndroid@reddit
Good stuff!
FFFan92@reddit
I tried a 1440p ultrawide and it was a blurry mess. The low DPI made it hard to read text.
Returned the monitor and went with a 32” 4K OLED. I am in heaven.
PollShark_@reddit
What size was it? Ultrawide won't make the resolution worse just because it's wide. It'll have the same density as a regular 1440p monitor
FFFan92@reddit
34” and to me it looked very blurry. Like someone smudged the screen. The 4K monitor was a night and day difference in visual clarity.
StolenApollo@reddit
34” is definitely massive for 1440p. Personally, I think 27” is the ideal monitor size and 1440p is fantastic for that size, but I do agree that it’s definitely a noticeable difference for 32” or higher and 4K suits those well.
FrostyD7@reddit
34" ultrawide 1440p and 27" 16:9 1440p are the same dpi.
WhatIsDeism@reddit
34 is the standard size in inches for most 1440p ultrawides; it's the same height as a standard 27 inch 16:9 monitor. Source: my 34 inch ultrawide sits right next to my 27 inch 1440p monitor. They are the same DPI.
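For anyone who wants to check the numbers, a small sketch of the pixel density math (sizes and resolutions taken from this thread; the formula is the standard diagonal PPI calculation):

```python
import math

# Pixels per inch from resolution and diagonal size:
# PPI = sqrt(width_px^2 + height_px^2) / diagonal_inches
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 2560x1440 (16:9): {ppi(2560, 1440, 27):.1f} PPI')  # ~109
print(f'34" 3440x1440 (21:9): {ppi(3440, 1440, 34):.1f} PPI')  # ~110, same density
print(f'27" 3840x2160 (16:9): {ppi(3840, 2160, 27):.1f} PPI')  # ~163
print(f'32" 3840x2160 (16:9): {ppi(3840, 2160, 32):.1f} PPI')  # ~138
```

The 27" 1440p and 34" ultrawide 1440p come out within about 1 PPI of each other, which is why they look equally sharp side by side.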
StolenApollo@reddit
Oh wait, I assumed it was 34" at a generic 16:9, LMAO. I feel very dumb, I dunno why I forgot ultrawides exist when I wrote that. That makes so much more sense lmao
In that case I gotta disagree with that guy about the visual clarity. To each their own, ofc, but for me, having used 4K and 1440p monitors, the difference is not worth the performance hit in the slightest and is not that significant. Unless it’s an OLED because those often have nonstandard pixel layouts that screw with text.
PollShark_@reddit
Huh, that's really strange? Did you only ever have that experience with that one monitor? Cause maybe you had TAA enabled at the time or something, I've never experienced that with a monitor unless it had a weird setting/a defect
Jujube-456@reddit
The details on 4k aren't that crazy, but I have a 28in 4k 144hz monitor and reading text on it is otherworldly. Any computer work outside of gaming is made so much better by 4k that I can't go back.
Typical_tablecloth@reddit
I felt the same about the fancy new OLED Asus 4k monitor I bought and returned. Everything looked amazing but after a few days I was already starting to get used to it, and I missed my frames. The bigger bummer is that you can’t just run the monitor in a lower resolution either. Turns out 1440p on a 4k monitor looks significantly worse than native 1440p.
FunCalligrapher3979@reddit
That's why you don't change the resolution you just use DLSS. DLSS at 4k using performance mode looks much better than native 1440p while performing about the same.
IntermittentCaribu@reddit
Obviously you have to drop down to 1080p from 4k to get integer scaling. Like you have to drop down to 720p on a 1440p monitor for the same.
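A tiny sketch of that rule, assuming the usual definition that integer scaling only looks clean when the panel resolution is an exact whole-number multiple of the render resolution in both dimensions:

```python
# Returns the clean integer scale factor, or None if only blurry
# fractional scaling is possible.
def integer_scale_factor(render, panel):
    (rw, rh), (pw, ph) = render, panel
    if pw % rw == 0 and ph % rh == 0 and pw // rw == ph // rh:
        return pw // rw
    return None

print(integer_scale_factor((1920, 1080), (3840, 2160)))  # 2    (1080p on a 4K panel)
print(integer_scale_factor((1280, 720),  (2560, 1440)))  # 2    (720p on a 1440p panel)
print(integer_scale_factor((2560, 1440), (3840, 2160)))  # None (1.5x, not an integer)
```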
Sh0ck__@reddit
Which is why I went towards 3440x1440p, ultra wide might be a bit of a gimmick but it brings so much more than going 4K in my opinion! The difference is much more noticeable and enjoyable
stefanlikesfood@reddit
I think an OLED 2k would look better IMO
spider623@reddit
Same shit, no real 4k native 120fps stable yet, stay at 1440p for now
GladlyGone@reddit
Since when do you need native 120 fps for a game to be enjoyable?
spider623@reddit
Since you need to match our damn high refresh rate monitors and tv to avoid tearing....
StewTheDuder@reddit
I don’t have this issue. VRR is a thing on newer/nicer displays and helps this tremendously. Or optimize your settings and lock the frame rate and use VSync if VRR is not acting right.
spider623@reddit
the issue is with VRR...
Visible-Concern-6410@reddit
I honestly don’t see a point in it right now since upscaling from a lower res is pretty much required to make current gen games run at a decent framerate with decent settings. 1440 is the sweet spot right now, and 1080P still looks great if someone wants to really get high fps.
scylk2@reddit
I've seen a lot of people say that upscaled 4k looks better than native 1440p
ansha96@reddit
Of course it does, much better.
scylk2@reddit
Is it sarcastic or is it for real?
ansha96@reddit
It may seem sarcastic only if you never played on a 4K monitor...
scylk2@reddit
yeah but upscaled 4k looking better than native 4k? how?
ansha96@reddit
Many games use terrible AA algorithms (TAA), that's why...
Visible-Concern-6410@reddit
This is definitely true. DLSS is honestly the best antialiasing I've seen, I'm surprised NVIDIA doesn't add a native resolution mode that only applies their antialiasing.
ansha96@reddit
You have it, it's called DLAA.
binge-worthy-gamer@reddit
4k is a series of compromises.
You're often not going to be able to pull off a native 4k but upscaled 4k can look fantastic with DLSS (and now with FSR4). So if you absolutely can't stand any form of upscaling then 4k60 and 9070xt are not going to mix very well.
Or you could turn down some settings from Ultra to High, maybe you'd be able to push a native 4k that way.
You're most definitely not going to be able to do any meaningfully high framerate past 60 in new titles so that's another compromise.
And so on.
I've had a 3080 for the past few years and I've been doing a mix of the above to get a 4k like image. It's amazing IMO and leagues ahead of a native 1440p, but ymmv.
ShadiestOfJeff@reddit (OP)
I was thinking on the assumption that upscaling is a necessity at this point.
Calm-Bid-8256@reddit
For 4k it pretty much is
Ouaouaron@reddit
Max settings is almost never a good idea on a game that has come out in the last 5 years. There's a reason that the Avatar game locked its max settings behind a command line argument.
ParryHooter@reddit
Well it really depends what you’re playing. I play a lot of single player games and with those it just enhances the games atmosphere and draws me into the world more. For those I’ll take the hit and play sub 60, worst case scenario in some harder games I’ll throw on DLSS for more frames for a tough boss. I personally love maxing out settings games look so incredible these days.
Ouaouaron@reddit
In a lot of games, the difference between "High" and "Max" for most of their settings is nearly impossible to spot while being incredibly taxing on your hardware. That's not always the case, but as a rule of thumb you'll do a lot better leaving everything at High unless you know what a setting is doing and why you prefer it at Max.
As general advice, at least. I'm not hell-bent on convincing you that you should act any differently.
FACE_MACSHOOTY@reddit
and thats why its bullshit
uspdd@reddit
The quality of FSR4 is insanely good, you won't really lose that much even going performance at 4k.
Dredgeon@reddit
I play on a 7800xtx and I play almost all games 4k60 and up at high settings (because they are indistinguishable from ultra and give me more stable performance). I play all kinds of new games and only raytracing really drags on it. And the new cards from AMD have closed the gap to Nvidia on that. I really enjoy playing at 4k. In fact I've never played a game below 4k ever since I built my first PC 5 years ago. I will say, if you can enjoy 1440p (I have pretty sharp eyesight), you would be better off switching to OLED.
SloperzTheHog@reddit
Personally I think once you go 4k there’s no going back.
ScornedSloth@reddit
I would argue that someone who can’t stand upscaling has either not tried dlss4 and fsr4, or they don’t actually play games, but just watch testing videos on YouTube in slow motion.
binge-worthy-gamer@reddit
Maybe. I've been using DLSS for a long time and while it's amazing it can still be a bit of a mixed bag. Even DLSS 4 creates weird ghosting in some situations and it can be super obvious to some. The saving grace is that sometimes the native TAA implementation is worse.
awr90@reddit
I get 4k 60+ native with my 9070xt. But nobody buying these GPUs for $600 or more wants to play games at 60fps anymore.
binge-worthy-gamer@reddit
You get 4k 60+ fps native in Doom TDA at Ultra Settings with Path Tracing Max?
(I know this is an insane ask from any card, but that was my point about 4k being a series of compromises and hovering around 60fps often for newer games is also one of those compromises, which I explicitly stated)
cowbutt6@reddit
Unless you join r/patientgamers and play older games on newish hardware...
binge-worthy-gamer@reddit
Of course. Even the Steam Deck can push 4k on Halo 4 (and it looks stunning)
PsyOmega@reddit
Can it, though?
I've been playing MCC on my Deck and it did alright with Reach and the Halo 1 remaster, but the Halo 2 remaster? 50fps. And that's the same graphical level as 4, basically.
I'll be happy if it can lock 90 in H4 when i get to it though.
binge-worthy-gamer@reddit
Yes. It can't do 60fps at 4k, closer to 40. There's resolutions between 1440p and 4k that it can manage at 60fps.
At 4k it's mostly running out of VRAM.
RoofTopSlop@reddit
3080 still pulling its weight and more in 2025. Is yours evga? Can’t bring myself to buy a new card because evga pulled out after 3000 series
krypton1an@reddit
i love my evga 3080 not looking forward to replacing it anytime soon...
binge-worthy-gamer@reddit
FE
It's showing its age but not enough for me to be bothered. Occasionally I'll have to drop a game down to 1440p or get a bit more aggressive with scaling.
tan_phan_vt@reddit
I got a 3090 now but before that it was a 3080. It was definitely pulling its weight and has staying power for sure.
The only problem was the amount of VRAM, 10GB is not good enough going forward.
Rainbowlemon@reddit
I think people have forgotten the art of tweaking game settings to get best performance per visual fidelity. No need to run your games at 4k ultra when 1440p high might look almost exactly the same and double your FPS.
Hell, I know it sounds stupid, but I've started going super low res with some games, dropping them down to a quarter of the resolution. If it's a retro/pixelated game, it really doesn't make a huge amount of difference if you use low res + integer scaling and massively cut down on the number of rendered pixels.
binge-worthy-gamer@reddit
On the right display even an upscaled 4k looks significantly sharper than 1440p. That was part of my point. I'd rather sacrifice other settings than resolution.
Rainbowlemon@reddit
Yeh, for some games I'd still prefer lowering other settings like shadows & lighting. For retro games if you use integer scaling (or nearest neighbour scaling if the game supports it) you get a nice sharp image and don't really lose anything in visual fidelity. A lot of emulators do it; you can get a higher framerate at a "lower" resolution with a crisp image.
EastvsWest@reddit
Yup, 4k just not worth it unless you want to spend 4090/5090 prices for the uncompromising experience I prefer. 3440x1440 has been my sweet spot for more than a decade.
vexir@reddit
I play MMOs on 60 FPS productivity monitors (4k and 5k2k) and I love it. Don’t need higher frame rate and the super crisp text is great!
Dutchy9225@reddit
I play games on 4k with a 6950xt. I can target 120+ fps on things like Apex Legends (which I play a lot). For most RPGs I target 60fps, which is doable, just not on ultra. But on newer titles I usually turn on FSR set to "quality" which helps out a lot.
3G6A5W338E@reddit
4k gaming was always possible even with weak GPUs via upscaling.
Today, both the GPUs and the upscaling got better.
I run a 4K screen with a rx 7900gre, most games I can render native, and for the rest there's FSR.
Xandril@reddit
I honestly struggle to tell the difference between 2K and 4K.
StewTheDuder@reddit
I game half the time at 4k on a 65” OLED tv and half the time on a 1440UW OLED with a 7800x3d/7900xt system. The 7900xt has plenty of horsepower for 4k 60 fps in new titles with optimized settings. I can easily adjust a few things and strike a good balance of visuals and performance. If I need to use FSR 3.1 or XESS quality for a little boost I will. They’re acceptable to me at 4k, not so much at 1440. XESS being the better tech in most scenarios.
All that said, the 9070xt can handle 4k very well in like 99% of games. There's 1% atm you'll need to optimize some settings more on, but you should be doing that anyways.
zexton@reddit
would never replace my 4k screen with a 1440p screen, dlss saves the day in almost all modern games,
old games that don't scale UI properly are most of the time designed for 1080p screens to begin with, which scales perfectly to 4k,
slenfir@reddit
You diggin in meee
TorinDoesMusic2665@reddit
I'm sticking to 1440p 180hz with my 9070XT. The only reason I could see for upgrading later on would be for OLED and a higher refresh rate like 240hz, which is expensive as hell
Parthosaur@reddit
I'm running a 6700 XT, it's bad. stick to 1440p and you'll still be a happy camper 2-3 years from now
Tayback_Longleg@reddit
3090 and 9800x3d here. I think you will be okay if FSR4 is picked up and is good.
I’d be upgrading this gen if I didn’t get the dlss4 upgrade and the transformer model upgrade.
cbrec@reddit
A lot of diminishing returns, planning on going back to 1440p with my 4090 maybe an ultra wide
scylk2@reddit
What monitor size you got? Not worth it compared to 1440p for U?
cbrec@reddit
32 inch 4k and a 27 1440p, only reason why I’m not using my older 1440p is because it’s much dimmer than the 4k, but the 4k’s higher fidelity really wasn’t worth the lower fps that came with it
scylk2@reddit
What about the size? Not worth it either? Are you gonna go back to 27" 1440p or 32" 1440p?
Financial_Warning534@reddit
Fail.
Dante9005@reddit
W, I went from 1440p to a 1440p OLED and that itself was a huge difference.
Unique-Client-4096@reddit
It kinda depends on the game and the hardware. There are games that a 5090 will get insane framerates even at 4K, and there are games where even a 5090 needs to use upscaling.
I think personally if i owned a 5090 or 4090 i’d probably still use DLSS because i’d rather game at 90-100+ fps instead of like 50-80.
MyStationIsAbandoned@reddit
It's not worth it unless you play on a big screen. On a 27inch monitor, it's a complete waste of money and energy.
Fantorangen01@reddit
Actually🤓 2K is 1080p.
3840x2160 is 4K
1920x1080 is 2K
2560x1440 would be 2.5K
And that's not even including the cinema versions. Those are defined as a part of the DCI spec. 4K DCI is 4096x2160. 2K DCI is 2048x1080. 90% of the movies you see in a cinema are one of those 2.
KingdaToro@reddit
Almost. The fact that 7680x4320 is officially 8K, rather than 7.5K, means the rule is to round the horizontal resolution to the nearest 1000, not the nearest 500. And that, in turn, means that 2560x1440 is 3K.
TheGreatBenjie@reddit
Not quite. DCI 8K is 8192x4320. The rule was never to round, they were always based on the DCI resolution standards. 2.5K is just a weird solution to 1440p being popular but not being related to a specific DCI resolution spec.
KingdaToro@reddit
Rounding works, though, for literally everything except 2560x1440 wrongly being called 2K rather than the correct 3K.
TheGreatBenjie@reddit
"correct"
There is no correct K designation for 1440p because it doesn't correlate to an actual DCI resolution spec like 2048x1080 or 4096x2160
Fantorangen01@reddit
Annoying. But your logic is sound
doctor-code@reddit
1080p would be 1K. 1920x1080 has 1/4 the pixels of 4K: 3840x2160 = 8,294,400 pixels, 1920x1080 = 2,073,600 pixels.
dom6770@reddit
any resolution with *K is actually only a cinema resolution. 4K is 4096x2160. Monitors and TVs have Ultra HD (3840x2160).
I mean, FHD/QHD/UHD is much easier to write anyway.
MasticationAddict@reddit
4K is fine but expect to need DLSS in most games unless you have a 4080 Super or better. Some games you just can't do it even with a 5090 but those games are giant outliers
We're in a bit of an awkward period. It's not just that games are unoptimised - that's definitely a major contributor to the problem - but also that we're seeing rapid evolution of AI upscalers, and developers are currently relying on these as a baseline expectation. This is combined with tight timeframes.
So basically? Yeah it's good, but unless you are way way in the high end, expect to be comfortable with at least some DLSS. The good news is? DLSS4 is kind of insane, the quality is quite a leap ahead
One_StreamyBoi@reddit
4K is amazing, game optimisation is dogshit
My 4070ti + 9700x get 90-110fps in most titles high-max settings with dlss balanced and a slight overclock
Nervous-Bee-8298@reddit
2K OLED > 4K LCD. If you have to choose, pick the color accuracy.
DaggerOutlaw@reddit
Whoever started this stupid trend of referring to QHD as 2K is an idiot who can’t read numbers. “2K” is 1080p.
4K - 3840x2160 - UHD
2K - 1920x1080 - FHD
2.5K - 2560x1440 - QHD
Ashari83@reddit
It's the other way around. 4K was a marketing gimmick when it was new; it should have been called 2K. Every other resolution has always been called by its vertical pixels, i.e. 720p, 1080p.
dom6770@reddit
720p/1080p are not display resolutions, they're video resolutions, technically only relevant to movies, Blu-rays and so on. Displays don't have interlaced/progressive.
Raunien@reddit
Progressive/Interlaced is only relevant to CRT screens. Basically, do they draw every line each time around, or draw the even lines then the odd lines. Being able to display progressive video was a major leap in image fidelity at the time. Digital displays are effectively always progressive, although they're drawing their "lines" (rows of pixels) simultaneously rather than scanning an electron beam across the screen. The p was kept at the end because everyone was familiar with it. That said, in my experience digital screen manufacturers used the terms "HD" and "full HD" to refer to 720 and 1080 respectively.
The switch away from drawing horizontal lines one at a time is probably also why marketing moved away from vertical resolution to horizontal. Vertical resolution is important when it's intimately tied to how the screen works, but when you can just assemble an arbitrary rectangle of pixels you just go with the bigger number.
Xjph@reddit
"4K" was a term first used in the cinema industry to refer to digital projection resolutions of approximately 4000 pixels wide. "2K" was retroactively named after "4K" gained traction, but within the cinema space still referred to approximate horizontal resolution.
Both the 4K and 2K terms were eventually standardized by DCI, they are 4096x2160 and 2048x1080, respectively.
During this time, as you say, home theatre and TV marketing folks also decided to co-opt the term "4K" to refer to 3840x2160, using the rationale that it too was approximately 4000 horizontal pixels. I can't speak to the specific reasons for why they switched from vertical (720p, 1080p) to horizontal, but I'd wager someone thought that "2160p" didn't roll off the tongue as easily.
Unfortunately the back-naming of 2K in that space was less around what the actual numbers were and ended up just being someone noticing that "2560x1440" started with a 2 and "1440p" hadn't caught on in the common discourse the way "1080p" had. So while cinematic 4K and consumer 4K are fairly close to each other, cinematic 2K (which is an actual standard) and consumer 2K (which is now ambiguous nonsense) are not even close.
hank81@reddit
Man.... 3840x2160 = 1920x1080 * 4
That's why it's called 4K: four times as many pixels as Full HD.
Fantorangen01@reddit
But.. high number = better
nartek01@reddit
I think it depends on preference. When I first saw 1440p I instantly knew what my next monitor resolution would be. But when I saw 4k I was in awe, yet I wasn't prepared to throw my money at it. And this was when I bought my 1080 Ti. I'm on a 4080 now and I still think 1440 is THE sweet spot for performance and resolution. My criteria is, as long as I can iron sight in games such as Tarkov, DayZ, Arma, I'm content.
WheatFartze@reddit
If you can afford it, go for it. I have two displays, a 4k 240hz OLED 32” and a 1440p 144hz 27” regular monitor. I'm running a 5070Ti and a 9950x. The 4k with high settings absolutely obliterates the 1440p with ultra/epic settings. The extra pixels make a huge difference, but it probably helps a ton that it's an OLED too, so the display is crystal clear. I normally target having my games run at 140fps, and I'm a huge fan of upscaling and using frame gen to get there. I play a wide mix of games, so the only time I turn my settings down for more frames is competitive shooters, but even then I'm really targeting a stable 160 fps. Idk what your monitor setup looks like, but if you're aiming for anything over 27” I'd recommend going with 4k. For example, Expedition 33 for me runs at 70 fps, but with a 2x on Lossless Scaling it runs slightly over 100fps, and for a single player game that's perfect. I get more frames on Cyberpunk but that has native frame gen and better upscaling
Pajer0king@reddit
Don't know, don't care, still using 1024p and occasionally 1080p 😎
S0ulSauce@reddit
My opinion may not be shared by many, but I think 4K should generally be avoided in favor of 1440p unless you have a near top tier card and understand the tradeoffs. What happens is the performance impact of going to 4K is very large considering the visual differences are not monumental, and FPS is going to suffer majorly. Basically it takes a lot of horsepower for little benefit (1440p looks great already).
I actually do use 4K, but there are a set of "understandings," some of which are compromises:
1. I use a decent OLED TV for both gaming and video. The TV doubles as a monitor, it looks amazing, it's affordable, and it's 2 birds with 1 stone. The downside is no dual monitors, the TV is in use when gaming, and the TV only has a 120 Hz refresh rate. It also isn't going to be quite as fast as a high end 4K monitor on latency.
2. I know that I'm not going to get 200 FPS on AAA games (the TV won't even do it). I'm okay with that. As long as I can get 90 FPS or higher, I'm fine with it. FPS-critical shooters aren't what I play mostly, and 90 FPS is fine for everything else to me.
3. I know I'll be using DLSS where possible. I understand that I'm not going to be maxing out settings and getting great FPS at native resolution at 4K. From having tried FSR and DLSS, I'd recommend Nvidia if you want 4K. Have had good luck with DLSS. Frame gen sucks a lot of the time TBH. I generally don't use it, but I do if it's a great implementation of it.
If I didn't have the decent TV already next to the PC, I had a weaker GPU, or I was determined to max FPS as much as possible, I would not be playing games in 4K.
techaneal@reddit
1440p OLED is just too good. Maxing out graphics settings, the visuals are 🔥
Captain_SmellyRat@reddit
I play at 4K with a RTX 4070 😎 and get 120 FPS in most games with max settings and DLSS Performance + FG.
ChampionBaby@reddit
FFXI still looks better and is a far more enjoyable game even at 1080 30fps
Newer games may seem like they look better at a glance but the game is just not as good.
zman6116@reddit
4090 and 7800X3D on M28U at 4K. Rarely am I below 60fps on anything. Typically I like the fluidity of 120+ FPS so I will turn down settings to hit that. I think 4K gaming a great personally
spawnkiller97@reddit
You're making me feel old when you say "back when the 2070 came out" lol, that was what, 2018, 2019? Honestly, stuff that came out 10 years ago in 4k or 2k is still usable to me. After so long the used monitors will most of the time have issues with the backlight, but other than that, paying 80 bucks for a monitor that was 1400 when it came out, with a few minor issues, I'd take that any day of the week.
frodan2348@reddit
The bigger difference will be going from TN to IPS or OLED. At 27”, 4K doesn’t do that much in terms of visual fidelity over 1440p for gaming. Not worth the performance loss imo.
Department_Complete@reddit
I wouldn't recommend anyone who uses their desktop primarily to play games to buy a 4k monitor. The performance loss is simply not worth the visual clarity. You won't notice the lower resolution while playing, and you won't have to use upscaling as much to hit acceptable frame rates. A very high quality OLED HDR 240+hz 1440p monitor will simply give you a better gaming experience than a mid range 4k monitor for around the same price. My brother plays on a 1440p 360hz QD-OLED HDR monitor with a 5090 setup. The 4x frame gen lets him hit around or above 360fps in most games, which he couldn't do with a 4k monitor. I personally have the same monitor but with a 7900 XTX; I've looked at it next to a couple of 4k monitors that were around the same price, and genuinely found mine to look far, far better than the 4k ones.
Zollery@reddit
Honestly. I would say a good Oled monitor can make a big difference in quality too. I have a 2k oled monitor and it was a big step up from what I had before
JVIoneyman@reddit
Without AI upscaling I think 1440p is the way to go. 4k if you are going to use DLSS; even Performance looks very good most of the time.
imjustatechguy@reddit
4k would be good if you had a decent GPU and stuck mostly to single player, non-competitive, titles.
2k has been where it's been at for me for about a decade now. I've been able to run everything I want natively with no issues since I got my first 2k monitor.
Ok_Jacket_1311@reddit
I found upgrading from 1080p to 1440p rather underwhelming, so I'm not bothering with 4k, ever.
W1cH099@reddit
I play in 4K with a 4080 Super, which is the same performance as your 9070XT and everything runs fantastic
Of course you need to tinker with settings here and there, not even a 5090 can push some games without DLSS and frame gen. I'm currently playing Black Myth Wukong with high settings and high ray tracing at around 100 fps with DLSS and frame gen, and everything looks incredible
TheGreatBenjie@reddit
2K is 1080p if you weren't aware. 4K is much clearer. 1440p is still a great middleground though.
RevolutionaryBug3640@reddit
It’s like a blind person seeing for the first time.
Zatchillac@reddit
A TN panel? Anything would be an upgrade from that. I didn't really think the difference from 1440p to 4K was as great as 1080p to 1440p. But what was extremely noticeable was going from 16:9 to 21:9 and now I can't go back down from ultrawide
El_human@reddit
You have 2 more k
onebit@reddit
Personally I went 21:9 3440x1440, which is about 60% of the pixels of 4K. The 3080 can drive my monitor, which is 75hz. In some games it barely holds it.
CheapCarDriver@reddit
Unfortunately impossible unless you sponsor yourself a 5090.
But I play older games on 4K and it's great.
Inuakurei@reddit
It was a marketing meme 5 years ago and it’s still a marketing meme today.
Xcissors280@reddit
4k is super nice for any kind of creative work but the difference in gaming just isn’t worth the performance hit at most sizes imo
FlakyLandscape230@reddit
Lucky for me I was born with optic nerve damage and can't honestly tell the difference between 2k, 4k or HD so I don't need to upgrade anything insanely....does suck having color spectrum issues though.
TheGreatWalk@reddit
Absolute garbage. Do not go 4k.
Instead, get a higher refresh rate 1440p monitor and play at 144-240 fps. It's a much, much better experience than 4k60fps, especially for faster paced games, such as shooters. Realistically, any game where you manually control your camera (with mouse or controller joystick) feels much, much better on higher fps. Even games like expedition 33 feel better, because lower input latency and smoother frames make parrying /dodging feel better and more reliable.
You can get some seriously sweet 240hz 1440p OLED monitors that are ultra low latency and improve gameplay so much. There aren't really equivalent low latency 4k monitors; even the higher refresh rate ones tend to have higher input latency.
If you're rich, you can always get a 1080/1440p 240/360hz OLED AND a good 4k144hz monitor, then decide between them, but personally I always, always went back to either 1440p or 1080p (depending on how GPU heavy the game is) to keep consistent smooth frames. It just always felt better. I ended up reselling my 4k monitor.
Desperate_Street5231@reddit
4k is fine but not worth the money you need to invest to run it at stable and fluent fps imo. At least the difference between 2k and 4k is not as big as many ppl claim all the time. Ofc it's noticeable if you get very close to your PC monitor and compare both settings. But while not actively focusing on the resolution, most users won't sense any difference while playing. Especially if you are running a PC-TV setup like I do.
There is no way seeing a difference in 3-4m distance between 2k and 4k in most cases.
I've been running 2k @120 for probably 6-7 years and I don't feel the urge to upgrade to enjoy everything in 4k @120.
But if, let's say, my GPU or TV gives up the ghost, I'll set up a 4k @120hz system for sure. Maybe even with a laser projector instead of the TV. But this will really depend on the budget and what the prices for those components and devices will be.
ro3lly@reddit
I've got a 9950x3d and 5090 and a 120hz 4k screen, most of the time, max settings give between 30-75 fps on games.
Add dlss, 45-90 fps.
Add 2x frame gen 90-135 fps.
KekeBl@reddit
If you have access to DLSS or FSR4 and at least an RTX3080/4070 or RX7800/9070, you can easily play at 4k output by using ML-powered upscalers. I get why people who tried FSR2 at 1080p aren't convinced by upscalers, but 4k is their actual intended scenario and you'll realize why when you try using them at 4k. In most modern games there's no reason to use full 4k when modern ML-powered upscalers are available, you can render twice the frames (yes, real frames) while getting the same or near-same visual quality.
I game at 4k with an RTX 4080, with DLSS ranging from 50 to 85% resolution depending on the game. This is for games that allow you to use modern upscalers. As for older games that don't have them, you should be able to run those at full 4k easily anyway.
Visually, 4k output just makes everything much clearer and more stable. It's hard to explain in a way most people will appreciate until they've seen it, but if you try to go back to 1080p/1440p after 4k, everything in game will feel very unstable, like you can visibly see the pixels shifting and shimmering, like there's a mild DOF and motion blur effect everywhere.
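A sketch of the upscaler math behind that "twice the frames" claim, assuming the commonly cited per-axis scale factors for the Quality/Balanced/Performance presets (the exact values can vary by game and by DLSS/FSR version):

```python
# Internal render resolution at 4K output for typical upscaler presets.
# Scale factors are the commonly cited per-axis defaults; treat as ballpark.
OUTPUT_W, OUTPUT_H = 3840, 2160
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

native_pixels = OUTPUT_W * OUTPUT_H
for name, scale in PRESETS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    share = (w * h) / native_pixels
    print(f"{name:<11} -> {w}x{h} internal  (~{share:.0%} of native 4K shading work)")

# Quality     -> 2561x1441 internal  (~44% of native 4K shading work)
# Balanced    -> 2227x1253 internal  (~34%)
# Performance -> 1920x1080 internal  (~25%)
```

By that rough measure, Quality mode at 4K output shades roughly the pixel count of native 1440p, which is where the "same output resolution, around half the render cost" framing comes from.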
ReviewCreative82@reddit
for me a 2k monitor was already a mistake and I'm just waiting until it breaks so I can go back to 1080p with a clear conscience
Why? Because at 1080p the entire screen is always in my field of vision, but at 2k only about 2/3 of the screen is and I have to turn my head
AnnieBruce@reddit
You might want to consider a bigger desk so you can sit a little further back. The keeping-everything-in-view issue is why I'm not going to 4k yet; to really shine it needs a bigger monitor (especially to keep font sizes legible) and I'm already borderline too close to my 1440p display.
ReviewCreative82@reddit
yeah, if I were one of those people who set their monitor up like a TV screen and treat their gaming computer like a PlayStation, playing games on a sofa or bed while the screen is several meters away, then I'd imagine 4k is a good solution
But I'm old fashioned, I like to sit in front of my computer and have everything fairly within arm's reach, so....
robotbeatrally@reddit
It's fine if you have a 5090 and 9800x3d. Otherwise it's just a little shy. Basically the first build I've ever felt hit the mark.
primaryrhyme@reddit
DLSS and upscaling in general are pretty incredible. Quality preset uses 1440p internal resolution and manages to look very close to native or even better (but this has more to do with native TAA being bad). If you are playing single player games, frame gen is good too so you have a lot of options to reach 4k high frames with little quality loss.
IMO the biggest deterrent isn’t necessarily the GPU (though I’d want at least 9070xt or equivalent) but the monitor itself. A great 4k monitor is much more expensive than a great 1440p monitor.
GiantToast@reddit
I'm considering downgrading to 1440; it just seems like the sweet spot for graphical fidelity vs performance. 4k is fine but you pretty much have to use DLSS or some other upscaling if you want consistent and decent FPS.
nacari0@reddit
I'm also curious about this. I remember going from 1080p to 2k was night and day in quality; I've since been stuck with 2k, partly for performance reasons.
Instant_Smack@reddit
Don’t do 4k
PogTuber@reddit
It's glorious for visual fidelity if that is what you like. In some cases it makes gameplay better. Such as racing games where seeing more detail in the distance can get you a better perspective on what's coming up.
Textures especially really come alive in 4K, since games are using such high resolution assets.
In most cases it's worth turning down some effects to get 80+ fps at 4K rather than playing at 120+ fps at 2k, to me.
Icy_Fold967@reddit
Just so you know, 2K is just 1080p. You're referring to 1440p.
_Bill_Huggins_@reddit
No one calls 1080p 2k...
If you say 2k people think 1440p.
sunqiller@reddit
4k was entirely worth it to me. I play on a 4k TV and I could never go back to a small screen.
Similar_Ad_7377@reddit
I have a 3080 12GB OC and 4K is going well for me. At med-high settings I can achieve 60-80 fps with no DLSS. Never going back to 1080p.
TLunchFTW@reddit
In more fps we trust 1080p gamers rise up
Routine_Left@reddit
I've been 4k gaming since 2017. Don't see the problem. And no, you don't need an X090 to do it. Never did.
-Rivox-@reddit
So, the thing is, it's not just about resolution, but also refresh rate, size, image quality and price.
What's better, a 27" monitor or a 32" monitor? A 60Hz monitor or a 180Hz monitor? A $200 monitor or a $400 one? A TN monitor or an IPS one?
That's a tough one that I'm also wondering. On one hand, I'd like a 4K monitor for the great image, on the other hand I'd also like to experience high refresh rate, all that without breaking the bank.
The 9070XT probably won't be able to consistently run newer games at 4K 60fps or 2K 120fps. That being said, older games can definitely reach that threshold, and you can use upscaling and frame gen to give you the extra oomph when needed.
So what is better, a 2K image on a 2K 27" monitor, or a 2K image upscaled to a 4K 27" monitor? Probably the second one, you get more pixels in the end. It's not as good as native 4K, but it should be better than native 2K. Then again, is it worth it versus going for a higher refresh rate monitor?
2K monitors are usually higher refresh rate, so the question might be, for a certain budget, would you rather get a 4K 60Hz 27" monitor, or a 1440p 120Hz 27" monitor? I feel like this is a very personal answer
PsychologicalGlass47@reddit
Twice the resolution, pretty damn good.
Tiny-Independent273@reddit
with upscaling, fine, depends what games you wanna play too
No-Log2504@reddit
I have a 7800x3d, 5080, and a 4K 240Hz monitor. I use DLSS and Frame Generation in basically any game that supports it. Definitely possible with your PC but you’ll be looking more at using medium-ish settings, depending on the game!
scylk2@reddit
Did you upgrade from 1440p? If so, would you say it was worth it?
1440p/4070ti, I'm considering upgrading to 4k 5080 👀
No-Log2504@reddit
I did! So I went from 1440p 240Hz VA to 4K 160Hz IPS to 4K 240Hz OLED, and each jump was an incredible upgrade. 4K 240Hz OLED is breathtaking and I absolutely would recommend it to anyone who has the budget. 4K is 100% worth the price tag in my opinion, especially if you’re considering upgrading to a 5080!
scylk2@reddit
Thanks mate, I think I'm gonna pull the trigger, life is short 😎
ShadiestOfJeff@reddit (OP)
If you switched to competitive games, are you able to pull 240 fps on say valorant?
No-Log2504@reddit
I don’t play Valorant but according to a youtube video with my specs they got like 500+ FPS at high settings so I’d say you can easily get that. Valorant is incredibly easy to run lol.
ShadiestOfJeff@reddit (OP)
Yea it's just so I have the option for high refresh rate games and games with high fidelity.
skunk42o@reddit
That's why I'm gonna go with a 4K OLED dual mode monitor.
4K 240Hz for most games, 1080p 480Hz for CS2 and Valorant.
aVarangian@reddit
4k is great if you want to ditch TAA and still easily enjoy the image
1440 is fine if you can DSR to 5k or if a game has MSAA
1080p/2k is something you get 2nd hand for 50 bucks if you just need a few pixels
Wet_FriedChicken@reddit
Absolutely pointless imo. Games have gotten so unbelievably bloated lately, that even on improved hardware, 4k is far harder to run. It’s sad. I’ve been on 2k for years and I love it. No desire to upgrade. I’m sure it can look better, but my brain refuses to accept that 2k isn’t peak. It looks flawless.
wheeler9691@reddit
I have never at any point seen someone say 2K and mean 1080p unless they're correcting someone about 2K.
g0nk73@reddit
I have been fine at 1440p gaming for a while, but just bought a 42" LG C4 as it was on a great sale at Best Buy. Now I'm struggling a bit. Playing Dune Awakening at 4K I have to have everything on Medium or Low and get major stutters (albeit, it's new and the stutters are reported by people on all kinds of systems).
My system: AMD Ryzen 7 5800X, Nvidia RTX 3080, 64GB Ram, 2TB SSD and the 42" LG C4.
I was hopeful, but I think my CPU is starting to bottleneck me; when playing Dune for instance, the CPU is at about 90-95% and the GPU is only at about 60%. Other games look great though, WoW Retail is fantastic at 4K 120fps. Gonna re-download Stalker 2 as I saw they patched A-Life yesterday, hopefully it looks fantastic.
Middle-Amphibian6285@reddit
People are crazy. It depends what frames you are looking for as well. I bought a 9070xt a couple weeks ago and I now play everything at native 4k @60fps or higher.
Cyberpunk I get 65fps 4k all best settings with ray tracing reflections
Helldiver's I get like 80fps
Dead Island 2 105fps
Final fantasy rebirth 70fps
I play on a 65" tv, it's fucking glorious looking
I don't really play competitive games anymore so I'm not caring for super high fps, long as I'm getting 60 minimum im happy, I care about the game looking the best and I couldn't be happier
I've seen people say helldiver's looks like PS4 trash.
Sorry but this looks amazing, very cinematic game.
https://youtu.be/H5EPqGq5V14?si=ooLascIDfccuB7KO
primaryrhyme@reddit
A 65 inch TV is a whole different thing and obviously needs 4k. The debate is more around 27-32 inch monitors on a desk.
D3moknight@reddit
I would be just as happy on a 1440p 120Hz monitor as a 4k 120Hz monitor. Upgrading from 1080p to 4k was night and day difference for me, but upgrading from 1440p to 4k is pretty subtle and unless you are just pixel peeping, you won't notice much difference other than lower framerates.
KindaHealthyKindaNot@reddit
Just honestly go 1440p OLED and you’ll never worry about 4K gaming again.
VoluptaBox@reddit
Depends on the game and what your expectations are. I do often play 4K on a TV with a 4070 super. I do a bunch of simracing and for those I target 120. For single player games I target 60. Stuff like RDR2 I run at native resolution and it looks great, CP I run with DLSS and it also looks great.
It's definitely a thing. Would I bother with 4K on a normal size monitor on a desk? Probably not. My main monitor is actually 3840x1600 at 38inch and before that I had 1440p at 27. Both looked great and never felt the need to go for a higher resolution.
asianwaste@reddit
In some shooters I often find it disadvantageous. What's a tiny dot on my screen at 4K, a sniper say, can appear far more distinct at a 1080p setting.
Goolsby@reddit
I've been gaming in 4k on a 3070 for 3 years now. People are still going to complain that there isn't enough frame rate but the resolution is what matters, 1440p has been my phone's resolution for 7 years, for a pc monitor that's pathetic.
Electrical-Bobcat435@reddit
Given your hardware, the best CPU by far, you get higher fps at lower resolutions where other CPUs would bottleneck the GPU.
Monitors are better now than your older TN in many ways; anything is an upgrade. OLED's class-leading pixel response times would be tempting too, but there are many good IPS options.
Your preferences are what matters. But I'd leverage your system's strengths, especially if you mainly play competitive games, by targeting fps and a lower res. What that looks like is up to you and your budget... an exceptional standard 1080p, a 1080p ultrawide, a standard 1440p... moving to 1440p ultrawide gets near 4k pixel counts, especially super-ultrawide.
One more option to consider: gaming TVs and some new monitors offer dual resolution modes.
nandak1994@reddit
I ran a 4K monitor on a 1050ti, some games that work well at 30fps were doable with FSR.
Went to a 3070 laptop with the same panel and more games started working with FSR/DLSS. After experiencing 4K, I could live with the slower frame rates, but not a lower resolution. I mainly do photo and video editing on my PC and resolution is king for me.
My friend has the same feeling and he upgraded to a 4070ti desktop card for his 4K panel. That lets him get 60 fps on most titles with DLSS/FSR.
KajMak64Bit@reddit
I like to say 4K is for retro gaming... it's to play older games but 4K remastered
I tried 4K on a 65 inch TV and played Call of Duty 2 and Fallout 3, and they looked like whole different games, like I'd installed an HD texture pack lol
GladlyGone@reddit
If your hardware is good enough, 4k gaming right now is great.
KajMak64Bit@reddit
nah bruh GTX 1050 is a 4K GPU but for games pre 2010's lol
scylk2@reddit
That's one of the pros of 4k I think. I play newer games sure but I also play a lot of older games or indie games
The_soulprophet@reddit
Moved to 4k last year from 2k and was disappointed in gaming. Productivity and internet things? It’s fantastic.
coldweb@reddit
About 2k sir
Chotch_Master@reddit
Even with a 4090 I don’t play at native 4k. I tried with TLou part 2 and the performance was pretty good. Solid 70-90 fps the whole time. But I tried dlss 4 quality and I literally can’t tell the difference between it and native. So with dlss quality (rendering at 1440 and upscaling to 4k) I get a locked 120 fps. Same results in stellar blade, and the gpu never maxes out
khironinja@reddit
I think 4K vs 2K is not as big of a deal as people act like it is. 4K is nice and all, but saving money, power, and my own sanity while still getting a very clear picture that's leagues better than 1080p is worth much more to me.
coolgui@reddit
I have a 9070 and play 2160p60. Many games natively at max, a few need FSR quality upscaling to be solid 60fps. But I'm fine with that. I play on an 85" TV like 15 feet away so it works for me lol.
JONNy-G@reddit
I made the jump a couple months back with a 4k monitor and 5080 (also bought the same cpu as yours), and it was similar to the experience of going from 1080p -> 1440p back in 2017.
Suffice to say it has been very nice! I can fully max out some really pretty games (RDR2, Days Gone, Helldivers 2) while on native 4k, but the latest releases will make you choose between that or framerate (I really like 90 minimum).
Stalker 2 was the first game where I felt DLSS was actually useful, if not necessary, but I did end up using it for Clair Obscur as it really does help the frames. Haven't played a ton of the latest games, but the Oblivion remaster was great (I only just made it out of the jail so can't speak for 100+ hour saves) and Nightreign/Elden Ring were sitting at the frame cap the whole time (though there was that one boss...).
One thing I will say: for anyone debating 144hz vs. 240hz, the monitor jump from 60hz -> 144hz was wayyy more impactful than what I experienced going from 144hz -> 240hz, and I basically never see those extra frames in my games unless I'm playing something quite a bit older, so you could probably save some money there if you're on a budget.
DreamClubMurders@reddit
I’m probably in the minority but I still don’t see the point playing in 4K. I don’t see many differences from 1440p to 4K other than a lot more power draw and lower fps
floobie@reddit
Personally, I want a 4K monitor for everything but gaming. I stare at code all day on a 1440p monitor hooked up to my work laptop, and the eye strain is real. The pixel density on my personal MacBook Pro’s internal display is way easier on the eyes, but I can’t use it for work.
So, when I’ve wondered if gaming at 4K is viable, that’s actually why. I think 1440p looks great for gaming, and I don’t really want to have to go completely ham on my next GPU purchase just to hit that mark.
I’m personally waiting a bit to see if 5K monitors catch on - 5K integer scaled at half resolution is 1440p, so I’d get the best of both worlds.
MetzoPaino@reddit
I’ve been playing games at 4k since I got my 3080 on release. It’s been great. I had to start playing on Medium type settings in the last year or so for the really pretty games but modern releases you can barely tell. I’ve now got a 5080, and it’s a champ.
Greedy_Bus1888@reddit
It's very feasible with good upscalers. DLSS Performance is very good now. FSR4 is not bad either. Also, setting everything to High is more than enough; no need to max.
joor@reddit
More expensive :)
LilJashy@reddit
On normal sized monitors (32" and below) you can't really tell the difference between 1440 and 4k, except that you get a lower frame rate. Don't bother
SomeoneNotFamous@reddit
4K OLED is my sweet spot, can't go back.
But nowadays, games are incredibly hard to run, even on the high End side of things.
Going 4K at 60+ FPS and highest settings will cost you a lot, and you'll upgrade more frequently.
I have a 9800X3D and 5090; some games need DLSS to reach more than 40 FPS (while not looking that good), some can be enjoyed all maxed out with DLAA. The Last Of Us 2 is one of them and it looks incredible, runs perfectly fine too.
p1zz4p13@reddit
You have a 4k card, a 4k monitor is perfectly viable in your case. The hit in fps isn’t as bad as you might think and if you would like to sweat in competitive fps then just lower resolution.
I’m playing expedition 33, horizon forbidden west, Spider-Man miles morales, cyberpunk, bf2042 and cod in 4k off an rx7900gre and it’s crazy nice. Plus watching media and browsing is just all around so much better than 1440p.
With a 1440p monitor you can only go so far, but a 4k one opens up higher resolutions, and these days with FSR, just do it and don't look back. If it's within your budget there's no reason to go 1440p over 4k, grow up.
No-Opposite5190@reddit
4k gaming was fine back when I was using a 1080 Ti.
cbntlg@reddit
I've been 4K gaming for over 7 years, now, and love it! I've been running a Cooler Master Tempest GP27U with an AMD RX 7900XTX and an Intel Core i5 12400, for the last two years and thoroughly recommend it.
ChipProfessional1165@reddit
The amount of people saying DLSS Performance at 4k looks bad, or that it's bad because the frames are "fake", is baffling. Sorry, the transformer model looks great and it sucks to be you with such a mind-limiting perspective.
coyzor@reddit
7800x3d + 4070 Ti Super here. No problems with 4k
ProgressNotPrfection@reddit
Depends on what you play, are they AAA, AA, or indie games? How many FPS do you need for immersion? Your rig should run indie games at 4k 120/144/165.
I can't speak to AA/AAA, I don't play their games.
You should be able to type your system specs and monitor resolution into one of the online calculators, select a game, and get an FPS estimate; maybe try that.
Here - https://pc-builds.com/fps-calculator/
bybloshex@reddit
It's slower, lower latency and harder to read
Xin946@reddit
Honestly, 4k is just for show really. A lot of new games you're better off with higher settings and more frames at 1440p than going 4k, it'll look and feel better. Also, just FYI, 1440p and 2k are different things.
cla96@reddit
it's fine for people who are okay with 60 fps and DLSS usage, which should be the norm tbh, especially DLSS, considering the level it has reached.
TryingHard1994@reddit
I went from an old Asus 34 inch widescreen 1440p monitor to an Asus PG34UCDM 4k OLED monitor and wow, I won't ever be able to go back, but that's probably down to the OLED. My 4080 Super is working its ass off but it runs all titles at almost max.
Zoopa8@reddit
Definitely viable in my experience. I've got a 4K LG G1 myself and usually just upscale. I've got a 4070 Ti and use DLSS; keep in mind there may be fewer games that support AMD's FSR alternative, particularly the latest version, which is magnitudes better than what they had previously and, AFAIK, is actually about on par with DLSS.
dazzler964@reddit
I recently made the switch from 1440p to 4K. Plenty of people have brought up graphical differences, so I'll bring up something else. Depending on the games you play, you might spend a lot of time reading text or looking at menus (think RPGs without voice acting or grand strategy games). I find text much sharper and easier to read in 4K, and have found my eyes are less strained.
janluigibuffon@reddit
2K refers to the horizontal pixel count. It is 1920x1080, also called 1080p or F(ull) HD -- 4K has twice the horizontal pixel count but 4 times the area.
For most people 1440p is still the sweet spot since it allows for higher pixel density than 1080p, with high refresh rates, while still being relatively easy/cheap to maintain. You can get away even cheaper if you're fine with 1080p, and more so if you're fine with 60fps. A rig like this can be built for way below 750€.
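The width/area relationship above is simple arithmetic; a small sketch comparing the common consumer resolutions against 1080p:

```python
# Width and pixel-count multipliers relative to 1080p ("2K" in this naming)
resolutions = {
    "FHD 1080p": (1920, 1080),
    "QHD 1440p": (2560, 1440),
    "UHD 2160p (4K)": (3840, 2160),
}
base_w, base_h = resolutions["FHD 1080p"]
for name, (w, h) in resolutions.items():
    print(f"{name:<15} {w}x{h}  width x{w / base_w:.2f}  pixels x{(w * h) / (base_w * base_h):.2f}")

# 4K comes out at 2.00x the width but 4.00x the pixels of 1080p;
# 1440p sits at 1.33x the width and 1.78x the pixels.
```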
PhattyR6@reddit
Can’t speak to how it compares to 1440p because I skipped straight from 1080p to 4K back in 2018.
Currently use a 3080Ti, which I’ve been using for 4 years and I play on a LG C477 TV.
I’m still very much enjoying 4K gaming. I use DLSS when available, depending on the game in either quality or performance mode. I generally play at 60FPS in most games, unless it’s multiplayer in which case I’ll target 120.
If I can’t run a game at 60 due to CPU limitations or a desire to play at higher graphical fidelity, I’ll cap at 30 or 40 FPS instead.
Games look great, I can’t complain.
DisastrousConcern415@reddit
Biggest problem I see there is the "TN" part, I would recommend getting an OLED, go for one of the cheapest you can find, regardless of the 1440p or 4k resolution, I'm pretty sure you would enjoy that upgrade.
iszoloscope@reddit
Expensive.
Mammoth_Substance220@reddit
I run PoE on 4K
Civil_Fail3084@reddit
1440p is the sweet spot for me. Especially when you can do it native
Antenoralol@reddit
Playing at 4K on a 7900 XT, older titles and MMO's though.
No upscaling / frame gen.
No interest in the slew of unoptimized crap that's called recent AAA titles.
notislant@reddit
Very expensive and potentially a very expensive slideshow.
I bought a 2080 for 1440 and I would have rather played it safe on 1080, I personally would not go above 1440 at this point if I got a new card. Unless it somehow looks alright downscaled to 1440/1080.
Games release as unfinished, even finished games can have horrendous performance.
If all you play are games like overwatch or valorant, you might be fine. If you play 'EA' games or even recent poorly optimized 'AAA' games? Might have issues.
themulderman@reddit
1440 isn't 2k.
Modern basic PC resolutions: 4k is 3840x2160, QHD is 2560x1440, HD is 1920x1080.
1080p is 2k. The naming convention just switched from the second number (vertical) to the first (horizontal) to make the resolution jump seem bigger.
beigemore@reddit
DLSS does not look as good as real 4k. There is a “tinge” to it. It’s clear but it’s not.
CataGamer31@reddit
Honestly 1080p to 1440p is a way bigger jump than 1440p to 4k...I have played in both and honestly I ofc prefer 4k but the difference in fps is not worth it for the quality...Maybe get a 1440p oled?
_Rah@reddit
I have a 5090 and I still decided to stick to 1440p 480hz instead of 4k 240hz.
If you go 4k, expect to have to do more frequent upgrades and use more expensive hardware. Or take the resolution hit and take 2x FPS increase like I did.
No_Store211@reddit
2K 1440p 27inch is 🐐
Elc1247@reddit
4K gaming is much easier to get into than before, but it still requires a pretty high end system to get good framerates and quality levels.
The jump from 1080p to 1440p is MASSIVE. The jump from 1440p to 4K is very noticeable, but not anywhere near as big of a difference.
If you have the cash to splash, then its nice to get into 4K, but for sure, 1440p is the minimum bar for anyone but the most budget machines. You can find good 1440p gaming screens in the US for about $200.
iYrae@reddit
QHD on 27inch within a normal distance to your monitor will feel "retina".
You will literally see no difference, so you're better off investing in a UWQHD or DQHD and experiencing a new level of immersion.
ShadiestOfJeff@reddit (OP)
Yea, a possible option for me was to get a 27" 2k OLED; sadly I don't have the desk space for an ultrawide.
scylk2@reddit
I have the exact same conundrum mate. 27" 1440p OLED or 32" 4k OLED. Currently have a 4070ti. I can afford to upgrade to a 5080, just not sure if it's worth the money and framerate hit. 32" sounds amazing for gamepad games but I'm a little concerned it might be too big for mouse keyboard games
CadencyAMG@reddit
I had the same dilemma last year and after trying both, I stuck with a 32in 4K 240Hz. The picture clarity was just too good and I currently play mostly CS2 and OW2 on it. Also, if you use your monitor for productivity at all, the text on a 1440p OLED at 27in looks unbearably smeary or fringy.
scylk2@reddit
Thanks for the input! I think I'm gonna pull the trigger.
For productivity, you're not too worried about burn in? Was planning to keep my old monitor for that
CadencyAMG@reddit
Honestly not at all. I do have the Geek Squad replacement plan to be fair, in case that does happen, but I haven't seen any signs of burn in on my Samsung G80SD since I got it back in August. Keeping an LCD display for it wouldn't hurt though if you are worried about burn-in. I use the 32in OLED with a vertical 27in 1440p IPS for my workflows, but my code editor is always just static on the right or left half of my OLED.
scylk2@reddit
Nice, I'm in Australia, the G81SF is on sale at USD 915 atm 👀
I'm a dev as well, good to know that the text fringing is not an issue in 4K
CadencyAMG@reddit
Hell yeah, text fringing isn't an issue at all at this resolution. That price is pretty good as well. If I remember correctly it's the same panel as the 2024 G80SD but without the Tizen OS and all the smart TV stuff, which you won't miss if you aren't using it.
CadencyAMG@reddit
QHD 27’ is nowhere near “retina” levels of sharpness even at normal viewing distances
rost400@reddit
Everyone worried about 4K, meanwhile I'm still enjoying games just fine on my 27'', 1080p monitor.
Kofmo@reddit
I prefer 1440p; I would rather not have to rely on frame gen and upscaling, and I like running 140+ frames in shooters.
My dream monitor would be a dual mode 1440p / 4k, but those resolutions don't mix well and I don't want 2 monitors :-)
jasovanooo@reddit
been running 4k since 2015 and 4k120 since 2020
It's been great.
BothElection8250@reddit
I've got the same cpu/gpu you do and have been using a 4k monitor for a little over 10 years. AAA games at native is basically only going to happen if it is very optimized. Indie games and older titles at 4k though are still an absolute treat. It's always fun going back to a game and seeing it at such a high res and frame rate. Of course it's also good if you enjoy watching movies or YouTube in 4k.
StolenApollo@reddit
If you’re running 32” or above, 4K is simply a necessity, imo. That said, I think the ideal monitor size is 27” and for that, having used all 3 major resolutions, I think 1440p is the most valuable. The sweet spot is 1440p monitors at 240hz or higher, but 144hz is also really good for 1440p. With how heavy modern games are, it’s just not worth getting 4K and then getting terrible frames in some games (for an average user without a flagship card).
I also like that, while my PC can do a lot with a 1440p monitor or 4K, my laptop can also make good use of 1440p.
Busy_Ocelot2424@reddit
It’s better; there are now a handful of cards that are totally suited to 4k, and then about another 6-7 cards that can play 4k decently well but are really more high end 1440p cards. You have to accept that upscaling at 4k is going to be needed sometimes and there's just no way around that. Frame generation can help as well, and lowering raytracing settings can be a boon for fps. How many cards are there where you can just do whatever you want in 4k and there's hardly any problem? 2. You know which ones.
sanjeevfibrous@reddit
In 2025, 2K and 1080p gaming remain the most popular choices because they offer a great balance of performance, visuals, and affordability. While 4K gaming delivers sharper detail, it requires expensive high end GPUs and a costly 4K monitor, something most gamers can't afford. In contrast, 2K and 1080p are far less demanding, support higher frame rates, and still look excellent, especially on mid size screens. For most players, especially competitive and budget conscious ones, these resolutions provide the best overall gaming experience without breaking the bank.
Username134730@reddit
4k is fine but upscaling is usually necessary in order to maintain acceptable frame rate.
AustinsAirsoft@reddit
I went over this dilemma for a while when building my first proper system. In almost every situation, 2k (1440p) was economical and still high quality, and 4k ended up being not worth it for the price jump and (in my eyes) minuscule quality increase over 2k, especially on something like a 27in screen.
Dante9005@reddit
I personally went this year from a 1440p 165hz display to a 1440p OLED 360hz display and that itself was insane. OLED is enough of a difference to just stay at 1440p, and fps stays higher, since 4k, while easier than it used to be, is still harder to run with all these unoptimized games. I do have an OLED 4K TV as well and I can say that between 1440p and 4k the difference isn’t massive. I’d just get an OLED 1440p monitor is what I’m sayin.
Sopheus@reddit
I'm on a 13900, 64GB and a 4090 and still on 2k. I have a 4k TV and tried modded CP with everything maxed out on both. If you have high requirements for fps and graphics fidelity then 2k is the way; at 4k the fps tanks badly (it's not acceptable to have 60 fps or lower in 2025 after you've experienced anything higher). Considering your setup, it will be even worse.
If you're still eager to try, I would advise checking out the AW3425DW, or its previous generation with the lower refresh rate but the same panel.
Sopheus@reddit
I'll wait for 6090 before jumping to 4k, same as VR and Pimax Crystal Super
volatile_flange@reddit
2 times more k
mixedd@reddit
Depends what you're after, native 4k gaming on latest titles and love to turn settings to 11? If you don't have 4090/5090 don't even attempt it.
Are you fine with upscaling? Maybe with frame generation? Then it's pretty much doable, though in some cases 4k loses its charm since some devs don't have a clue how to not make their games blurry af.
Xp3nD4bL3@reddit
4K is easy to achieve if you don't crank all settings to the right. If you do crank it, DLSS should be on. Note that Ray Tracing is a whole other beast 😄
Archimedley@reddit
expensive!
If you want to play new games at 4k, you are going to want like a current gen -80 level of performance
So, instead of going like 3-4 years between gpu upgrades for new games, it might be like 2-3
Like, it's doable, but it's kind of a big commitment if you want to play new games, unless you're ok turning the render resolution way down
Chezni19@reddit
GPUs can't do it, so you have AI make up for it in various ways that no one loves.
hihoung1991@reddit
It really doesn’t matter
SwordfishNo9878@reddit
I have a 1080p monitor. Maybe one day I’ll do 1440p but honestly I don’t ever see myself doing 4k outside of whatever my console does.
scylk2@reddit
Op: 1440p or 4k?
Guy with a 1080p: My opinion is relevant 🤓🤓🤓
No_Guarantee7841@reddit
Technology did go very far, to the point of not needing to torture yourself with a TN monitor if you want high refresh rates.
VeraFacta@reddit
Dead serious, I have been gaming in 4k since 2015. For 95% of games it is above 120fps. Very few games ever needed tweaking. For the few that did not run well at 4k, 1440p was acceptable. 1080p is 27+ years old and needed to die a decade ago.
Jbarney3699@reddit
4k DLSS is where it’s at mostly due to game optimization.
shadowshin0bi@reddit
For the more demanding games, 4K @ 60fps with your setup is more than reasonable to expect. Optimized games will perform admirably
Going from 1440 to 4K though, it will come down to the size and viewing distance of your desktop monitor to determine if the increase in PPD (pixels per degree) is worth it
If you do upgrade, it might be worth looking at other features like HDR, OLED, etc. for overall increased visual fidelity. Getting high frame rates will be easy in most situations @ 1440p, but 4K becomes a struggle since it’s more than double the pixels being rendered (before DLSS and other tricks)
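For anyone who wants to compare PPD directly, here's a rough sketch; the geometry is approximate (it averages pixels over the horizontal field of view) and the 27"/32" sizes and 24-inch viewing distance are just illustrative assumptions:

```python
import math

def pixels_per_degree(h_pixels, v_pixels, diagonal_in, distance_in):
    """Average pixels per degree across the horizontal field of view."""
    aspect = h_pixels / v_pixels
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)  # physical panel width
    h_fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_pixels / h_fov_deg

# Illustrative setups: 16:9 panels viewed from about 24 inches away
print(f'27" 1440p: {pixels_per_degree(2560, 1440, 27, 24):.0f} PPD')
print(f'27" 4K:    {pixels_per_degree(3840, 2160, 27, 24):.0f} PPD')
print(f'32" 4K:    {pixels_per_degree(3840, 2160, 32, 24):.0f} PPD')
```

By this rough measure, 27" 4K at desk distance works out to roughly 70+ PPD versus around 50 PPD for 27" 1440p, which is where the sharpness difference comes from.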
Nikorasu-chan@reddit
As others have pointed out, it's mainly the game itself rather than the hardware. Even with my 5090 I struggle in some games at native 4k just because of poor optimization. DLSS/FSR and frame gen definitely help alleviate that, and have gotten pretty good at maintaining quality while upscaling/generating frames. It's definitely off-putting for quite a few people, including me somewhat. But I also use a 4K 240Hz QD-OLED, so frame gen with it does wonders.
It also matters whether you yourself can notice or even care about the quality difference. For a lot of people 2k is preferable since they can't see the quality difference vs 4k and/or don't wanna take the fps hit.
Tldr; the hardware itself is pretty good, but game optimization lately has been terrible regardless of what resolution you're playing at. It's up to you to decide whether the resolution bump is worth the frames you'd give up.
Traditional-Track578@reddit
If you wanted to game at 4k you should've saved the CPU money for the screen. X3D is only good for 1080p.
Claims to the contrary aside, 4k gaming has been fine since the 6000/3000 series GPUs were released, especially with the advent of DLSS and FSR.
Playing at ultra at 4k is a literal waste of time. High at most but medium is more than enough quality for games these days.
People keep chasing that ultra horse when it's not needed.
I only game at 4k and went 7945hx but keeping my old 6900xt as it still pushes enough frames for modern games.
Effective_Top_3515@reddit
Think about it: once you use upscaling in 4k, what resolution does the GPU render at? 1080p? 720p?
Traditional-Track578@reddit
Quality on both FSR and DLSS starts at 66% of monitor resolution, which for 4k is 1440p. Still don't need an X3D chip.
Balanced is 58%, or just shy of 1300p; again, same as above.
If you are going lower than that I'd question why you are bothering with 4k
Effective_Top_3515@reddit
The lower the dlss quality, the less the GPU has to work. Check out hardware unboxed’s x3D vs x comparison video when upscaling.
The 5800x3d was 17% faster than the 5800x in one of the tests.
Active_Ordinary_2317@reddit
You’re mistaken about the CPU. As resolution goes up, the GPU load goes up while the CPU load remains the same until GPU bottleneck is reached.
Traditional-Track578@reddit
Plenty of benchmarks around at 4k showing a 7600x pulls the same frames as a 9800x3d at 4k. With a 5090.
There's maybe 10% difference in 1% lows but that's not worth the price increase.
Active_Ordinary_2317@reddit
You are right in this reply, but the original reply states that the X3D is only good at 1080p. That implies it is bad at 4k, which it definitely is not.
Traditional-Track578@reddit
Sure okay the wording wasn't clear.
The value of an X3D part reduces as you move up in resolution. They are as good as everything else at 4k, which value wise, isn't good.
Active_Ordinary_2317@reddit
Again I agree with you, but OP is asking how well his existing PC will handle 4K. CPU value is a past concern of his.
HealthySir4827@reddit
I have an rx9080 xt, 9800x3d and a 4k LG C2 OLED. I get over 100fps in almost every game; Space Marine 2 was the only game that was +90fps. RE4 Remake maxed with RT on, with FSR I got a locked 120fps.
Witch_King_@reddit
Wowee, the unreleased 9080xt!!
Melodic_Slip_3307@reddit
9080 XT? what
Witch_King_@reddit
They made a typo and I was having a bit of fun
henconst796@reddit
27" screens and below, no point going for 4k. Even 1080p is still relevant at that screen size. Don't spend too much hoping to get good performance with these days' games, which perform like shit.