If I'm a 1080p gamer, will a 7500F and a 5060 Ti/5070 basically set me up for the rest of the PC's life to play pretty much anything I want at highest settings?
Posted by PriceOptimal9410@reddit | buildapc | 118 comments
I've been researching places to buy parts for a PC recently. For most of my life, I ran with integrated graphics, till I got a PC with a GTX 1650. That PC broke and I had to get another PC with integrated graphics as a stopgap, which I upgraded with a 6900 XT early this year. However, de facto I was still locked out of playing many demanding games, since the PC would shut down all of a sudden whenever playing something too demanding.
Anyways, I had to leave that PC behind since I now study abroad, so I'm looking at PCs here. I've noticed that there are a lot of good builds here for good prices. When I search the GPUs and stuff up, however, a lot of the discussions and benchmark videos I see center around 1440p and 4K gaming, and the setups I described often get good fps on high settings even at 1440p.
I've always played old or less demanding games on low to mid settings, often at 30 fps, for most of my life, and that's how I play now on my laptop as well. I don't know how much better 1440p is than 1080p, but I think I'd rather not ruin 1080p for myself, just in case. I have bad eyesight anyways, so it may not even matter. Also, I care more about frames because of this; it's frankly irritating to play a lot of games at less than 40-50 fps. I can tolerate lower settings, but a low frame rate is just irritating. Being able to play at high fps while also on higher settings is a major boon for me, and surprisingly, even the cheapest PCs with dedicated graphics are now apparently able to do this.
It's a bit bizarre to me, in a positive way, because just 5 years ago, getting something that could play demanding games like Cyberpunk 2077 at 60 fps on 1080p and medium or high settings felt like a huge investment my family couldn't afford. My 1650 struggled to run it on lowest settings at like 30 fps. Yet nowadays it feels like the average budget PC can easily handle games like that on respectable settings at smooth framerates at 1440p, let alone 1080p.
I might have gone on too much of a tangent there, but anyways: is a 'basic' rig (as basic as stuff gets nowadays) now actually enough to enjoy basically anything I'd like at 1080p? That used to feel like a dream before, but apparently now I can just get it for a decent price. Am I missing something, or have PC parts really come that far?
gankernation@reddit
ok first off, this computer setup will be great for 1080p gaming. moving up to an ultrawide monitor at 1440p might not be as bad as you think. Try going to a retail store and looking at the ultrawide monitors to see if they fit your liking.
the 7500F and 5060 Ti should be more than enough power to run all modern games and future ones to come without too many issues. First thing you'll notice, as many others do jumping from a crappy computer with lower settings in games to maxed-out settings at whatever resolution you're running, will be a night and day difference!
PriceOptimal9410@reddit (OP)
Good point. I will try going to a store and look at the monitors, 1080p and 1440p, both with and without glasses on. Nowadays I do game with glasses on if I can remember to wear them, but I often forget or am too lazy to do so, since I can still game just fine without them. Just won't be able to notice as much details.
And yep, the upgrades that I make are going to be like day and night. I used to be accustomed to integrated graphics for so long, most of my favorite games are less demanding or indie ones, or older ones, and basically anything from even the mid 2010s was guaranteed to be a terrible experience because of unplayable fps. My first computer eventually got to the point where it couldn't run NFS Carbon (2006 game) at more than 1 FPS, on the menu itself, lmfao. My upgrade to a PC with a 1650 felt elite right there. And my 1650 is honestly nothing compared to most GPUs that are now marketed as budget options...
Honestly, while people have lamented how badly Nvidia has fallen off with GPU upgrades recently, I feel like, for someone coming from the third world in particular, this recent generation of budget 30, 40 and 50 series GPUs have really made modern AAA games way more accessible, when previously it felt like you needed to have a PlayStation or an expensive rig to play those. Post-covid years felt pretty good in terms of prices, until recently with the RAM price surge, I suppose
ryrydawg@reddit
My build is a 7500F / 32GB DDR5 / 5060 Ti / 1440p ultrawide monitor. Zero issues running any games so far. I recently played the new Mafia game on ultra settings using DLSS Quality, hitting around 100+ fps. Just stay far away from ray tracing
WazeJinxIsTaken@reddit
is 16GB single channel really bad? current bundle options offer 16GB single channel; for dual channel the price is too much
TokyoRock@reddit
I'm using a 1440p ultrawide with a 5060 Ti 16 GB, and it can handle 60FPS at max settings. Above 60 FPS depends on the game.
gankernation@reddit
Sounds like the fps is locked or you have vsync on. Try unlocking the frame rates in settings.
TokyoRock@reddit
I actually use vsync on purpose sometimes to limit FPS to keep my computer fans quiet lol
SnooPickles4476@reddit
a 21:9 ultrawide at 1440p will be fine with those specs, I've got a system with relatively similar performance (5700X3D, 32GB DDR4, 3080 Ti at 75Hz) and I've not had to turn the resolution down once.
gankernation@reddit
Yeah, I just got a 4080 Super recently, upgraded from a 3080 12GB variant. The 3080 was a good card and ran everything great at 3440x1440. I've got a 5950X and 2x32GB DDR4. Max settings on all games.
welcometogboardclipb@reddit
Highest forever, no. High settings with no RT that still looks good, probably
MagicTurtle3D@reddit
24" is for 1080p, 27" is for 1440p.
27" 144Hz 1440p. I am never going back. Should you? Most definitely. Just don't get a 27" 1080p monitor; the pixel density just isn't enough. The 5070 is a solid card and it should perform, but it really depends. Define "anything".
randylush@reddit
Exactly. I truly don't understand people who build new PCs around 1080p now. It makes so much more sense to upgrade your monitor first, then upgrade your rig around it if it doesn't keep up. Because with 1440p you can always render at 1080p and upscale, and it will still look better than 1080p native.
Although to be "fair" to OP, he did say that he has vision problems and doesn't always wear glasses while gaming... seems like the cheapest way to enjoy gaming for him is to simply put his glasses in another room
Nstorm24@reddit
The fact that you can't understand why some people would try to build around 1080p shows how privileged you are. In most places around the world gaming PCs are a luxury. And reaching 1080p at 60fps is, in most places, the Everest some people try to reach.
randylush@reddit
What I was trying to say is, generally a used PC is going to be a much better value than new if you’re aiming for 1080p. But yeah I’ll make sure to take my white guilt into account before posting next time
Nstorm24@reddit
I don't get the white guilt part. It's kinda weird how much you guys care about color.
randylush@reddit
I’m pointing out how weird it is that people bring up “privilege” when talking about 1440p vs 1080p
Nstorm24@reddit
Actually, comparing the two has nothing to do with privilege. But saying that you don't understand why people would build a PC trying to reach 1080p is the privilege part.
Not understanding that there are many factors (availability of parts, money) that can limit what someone can build is the privilege part. In some places a single GPU is worth 3 to 4 months of salary, including the extra tax.
And throwing in that "white guilt" comment just shows more of said privilege. The rest of the world has too many real problems to care about white/black privilege.
MagicTurtle3D@reddit
The 5070 is very 1440p capable, on older titles and newer ones. With the "Quality" setting it will run 2k at 80 or 90% resolution. OP's build is far from "poor". 😄
VitalityAS@reddit
No, not at highest settings. But highest settings don't matter much anymore. Ray tracing is an fps dump, and UE5 games are all really hard to run even with better hardware.
G2Keen@reddit
That's something only you can really decide. I've been on the 4070 Ti for a few years and have no issues, but I also don't run anything on max if I can't get 144 fps at 1440p. 1080p would reduce demand, and while I understand not wanting to see something better and ruin the cheaper alternative, it's more likely that products will hit a point where 1440p is the norm automatically. 15 years ago I bought my first 40 inch TV for like 600 on sale at a place I worked at. I saw one at Walmart last week, a 65 inch for 399. Prices are funny.
PriceOptimal9410@reddit (OP)
Yeah, despite prices sometimes fluctuating with shit like covid or now AI companies driving up RAM prices, stuff just seems to be getting cheaper overall. And admittedly, budget GPUs seem to be getting powerful faster than modern games are getting more demanding (aside from some of the true beasts like Alan Wake or STALKER 2 or that Indiana Jones game that requires ray tracing...). So the same will probably go for monitors.
Of course, I do live in a place where this stuff is more expensive than in the West, and me being able to afford a new PC in the first place admittedly puts me above the income bracket of many other students, so it will be a while before I'm seeing others with 1440p monitors
G2Keen@reddit
I wouldn't say 1440p is the average even for the West; hell, most people that own PCs run pretty junky systems. But I think even just being on this subreddit asking likely puts you at least in the ballpark of those that would want to run something nicer.
lsdstoned@reddit
If you're going to buy a new monitor, it's much better to go with 1440p, as the price difference is so small. An RTX 5060 Ti 16GB or 5070 will be excellent, and the Ryzen 5 7500F is even more effective at 1440p than at 1080p. With the new DLSS 4 Transformer model, DLSS Balanced is the sweet spot for 1440p, and even DLSS Performance looks excellent. Since 1440p DLSS Performance is equivalent to 1080p DLSS Quality, that's the lowest you should go. It makes 1080p feel obsolete and vastly inferior.
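The equivalence claimed above can be sanity-checked with the commonly cited DLSS render-scale factors (assumed here as the usual published ratios; individual games may vary). A quick sketch:

```python
# Commonly cited DLSS render-scale factors per quality mode
# (assumed values; exact ratios can differ per game/implementation)
DLSS_SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

def internal_resolution(width, height, mode):
    """Return the internal resolution DLSS renders at before upscaling."""
    s = DLSS_SCALES[mode]
    return round(width * s), round(height * s)

# 1440p Performance and 1080p Quality both render internally at 1280x720,
# which is why the two cost the GPU roughly the same.
print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
print(internal_resolution(1920, 1080, "Quality"))      # (1280, 720)
```

So under these assumed ratios, "1440p DLSS Performance ≈ 1080p DLSS Quality" falls straight out of the arithmetic: the upscaler starts from the same 720p internal frame either way.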
bubken99@reddit
The 5070 will be a good 1440p card for at least 4-6 years. That card may legitimately outlive 1080p lmao
Alarming-Song2555@reddit
Absolutely, yes.
Don't listen to the people screaming otherwise, it's overkill. My previous build was a 3070ti and that badboy easily rocked 1080p at fantastic levels for almost every game and predominantly max settings.
5070 is a much more powerful card and for any singleplayer titles, has the benefit of frame generation.
PriceOptimal9410@reddit (OP)
Honestly yeah, looking at actual benchmarks for even later modern games, that card indeed seems to be pretty overkill, considering that a LOT of the games I will play will never even push the card to its 100%... My previous 1650 is what I compare it to in my mind, because that was the first time I was able to play the stuff I actually liked and had the storage for (mostly mid-2010s games). Maybe it's actually storage I should be worrying a bit more about, rather than the card itself... I very rarely get the bigger, more demanding games, solely because of how much storage they take up
Alarming-Song2555@reddit
Get yourself 2TB and you're cruising. You will never run into a storage issue unless you're the type of person that doesn't uninstall games for 5 years after finishing them.
Old_Two1922@reddit
Depends on what games you play, but yes, 5060ti for 1080p should be plenty. My 3070 still runs pretty much everything I want with some tradeoffs in graphical fidelity.
Built a PC some time ago (not mine) with a 7800xt for 1440p. Is a higher end gpu and 1440p sweet as all heck? Yeah, it is, but it just costs more.
PC parts have come quite a long way in terms of performance. Honestly if you have the money and the regional prices make sense, I’d say go with a 5070 and you are set for a good long while.
Miyul@reddit
Yes. Don't believe any FOMO users in this sub. They are all dumbasses who think you can't enjoy playing video games without an RTX 5090 OC, a 7800X3D and a 240Hz 2k OLED monitor
Jackal-Noble@reddit
Ding Ding Ding!! Correcto Mundo
TomTheNothingMaster@reddit
I have a 240Hz 2k OLED while not being even close to this setup, and I'm doing absolutely fine in games; I enjoy a few of them at the entirety of what my monitor has to offer. No one needs a damn 5090 tbh
Synthoxial@reddit
Not even a 9800x3d what a fucking garbage setup
ghostsilver@reddit
absolutely agree, users here are all like "your pc is absolute trash if you don't buy C30 ram instead of C32"
SickOfUrShite@reddit
Actually i think you can't enjoy it without 9950x3d 5090 240hz 4k oled monitor, just personally
PriceOptimal9410@reddit (OP)
Tbh, for me, a guy coming from the Maldives where PC parts are ludicrously expensive, even just 60 fps at 1080p in any modern game is elite. I grew up with crap that couldn't play old games at more than 30 fps, so I guess I have relatively low standards... Plus I do like to stick to a budget. Now that I'm in Malaysia, though stuff is still more expensive than in the West, it's still somewhat more affordable than back home.
For me, even a 3050 would be a big upgrade, considering I'm gaming on a laptop. Though, I do still get concerned because some games have truly wacky requirements these days. I want a PC that can play the modern stuff with 60 fps at some setting, because I've felt like I lost out on a lot of good games simply due to not being able to run them.
Miyul@reddit
lol I'm Malaysian, hope you have a good time here. A 5060 Ti/5070 would do absolutely fine for you at 1080p. I mean, the GPU can run a lot of modern triple-A games at 2k 90fps stable; I wouldn't worry about 1080p at all. These obsessed PC users think everyone is running a 4k monitor with max ray tracing and DLSS every time we play video games, when we're not.
randylush@reddit
Exactly. Then when you look at Steam hardware surveys, the vast majority of people are just happily running low or medium spec hardware at 1080p.
nokei@reddit
I like my monitors at 21/22 inches, but over time I've been forced into 23/24. I'm not buying anything bigger than that, so the pickings are slim for fancy shit; I think I've seen one OLED and it was some random Asus touchscreen portable thing.
SKTT1_Bisu@reddit
If you can afford the 5060ti 16gb version or 5070 you'll probably be fine for at least the next 5 years.
a3poify@reddit
I’m happy with my 7800XT/7800X3D as a future proof 1080p setup. Can hopefully squeeze years to come out of it
_Flight_of_icarus_@reddit
I realize this is an enthusiast sub, but the more modest hardware I try out with games, the more surprised I am at how well it can hold up against far more expensive or newer options. Plenty of "bad" CPU/GPU options can still offer a good experience.
I mean sure, a 5090/X3D CPU setup is going to smoke everything else - but this implication that anything less isn't worthy is ridiculous.
YouKilledApollo@reddit
Some people are indeed crazy. You'd do just fine with AM4 and DDR4 today, and no, PC gaming on your own hardware isn't going away. It seems like this sub is filled with PC users who just got their first PC this year and never experienced any of the previous shortages.
FeralSparky@reddit
The problem today is Nvidia significantly cutting production for consumers and buying every piece of silicon they can get their hands on. This is different from any other shortage I have seen in the last 38 years, as it's only going to get worse until the AI bubble bursts.
YouKilledApollo@reddit
No, that's not a difference; it's happened before. What is different is that pressure from the US is making Korean manufacturers not sell off their old DDR4 equipment to Chinese companies, since the US threatens retribution if that happens.
Thanks to US geopolitics, DDR4 production has essentially stopped instead of being picked up like in previous generations. This will make this shortage worse, but not insurmountable.
Whether AI is a bubble or not has nothing to do with it; companies have bought 40%+ of the available supply before, and high demand isn't a unique or new thing. The US forcing the supply to decrease is new, however.
Symphonic7@reddit
Haha thats actually my build except for the 5090
-Kalos@reddit
There are a bunch of framerate princesses in here. I spent nearly $4k on my gaming build this year and I'm enjoying it just fine on my 120Hz TV. It's overkill, but 120Hz 4k is just fine. For midrange rigs, most settings don't even need to be ultra, ray tracing isn't even necessary, and DLSS does a decent job. Plenty fine even at 1440p if you have to
VanillaThrowAway8@reddit
I feel this comment. I spent around $3k on building a new computer earlier this year as well. Went with a 9800X3D and 5070 Ti to also use on my 4K 120Hz TV. Someone commented that a 5070 Ti is "not a 4K capable card" and it had nearly 50(!) upvotes in another thread around that time. Meanwhile I'm getting native 60+ FPS at 4K in a lot of AAA titles, and instantly 120fps as soon as I put on DLSS without FG, and I haven't even OC'd my card once yet to get close to a base 5080, although I could. So much gatekeeping in this community, or just flat-out ignorance. People think you need a 5080/4090/5090 to push 4K.
insufferable__pedant@reddit
This right here. A few days ago someone on this sub was asking if it was possible to build a 1440p machine for under $1000. I threw up an example of a build that I literally just put together for my living room, and somebody replied telling me that a GPU with just 8gb of VRAM was totally unusable for modern games. He never responded to me when I shot back that I had been doing just that and having a great time with my 3060 Ti until I upgraded earlier this year.
Honestly, I think a lot of it has to do with people just hitching their entire self worth to this hobby. Let's face it, a lot of us who get into gaming can be a little socially awkward (I count myself amongst this). Some people just can't figure out how to be comfortable with themselves and cultivate a social circle of people who value them for who they are, so they lean into their hobby - sometimes in very unhealthy ways.
K9Imperium@reddit
This. I got a 5070 earlier this year and for what I paid and what I got I'm very happy, and I play in 1440p.
SavedMartha@reddit
Yes
tyrannictoe@reddit
Highest settings now include path tracing in some titles. These cards will not be able to pull it off even at 1080p
PriceOptimal9410@reddit (OP)
Fair point. Some games apparently even demand ray tracing as mandatory. For the ones that don't, though, I think I will be somewhat okayish not using ray tracing. Or reducing settings as needed to get 60 fps. It's not that I don't want nice settings, of course, but what I truly need is at least 60 fps, with higher settings being a nice bonus thrown on top
Ok-386@reddit
Ray tracing isn't the same thing as path tracing. IIRC path tracing cost me around 50 FPS versus ray tracing in psycho mode when playing CP2077.
BananaBot6@reddit
What build do you have for this? As someone very into CP2077 raytracing/pathtracing
tyrannictoe@reddit
Some titles like Indiana Jones and Doom TDA always have RT on. Some titles are built with RT in mind but allow you to turn it off anyway, which makes the graphics look really bad. For example, AC Shadows and Silent Hill f look very bad if you turn off all RT.
PriceOptimal9410@reddit (OP)
Oooof yeah that's a fair point. I didn't know they made some games built with RT in mind now, even if you can disable it.
Well, I suppose I'm fine with that, since I won't be playing those games much, and I would still have access to most other games from even the 2020s. Before, I'd be stuck with exclusively early 2010s titles at max. Thanks for bringing that up though, it's something I will keep in mind
Marek_Key@reddit
But ray tracing still works great on low-end GPUs from the newest generation: the 5060, 5060 Ti, and 9060 XT. I've played Doom: The Dark Ages on highest settings with a 5060 Ti 16GB at 1080p and it was smooth as butter, but when playing on my 6700 XT at 1440p the game wasn't as smooth as Eternal and it wasn't as crisp, mainly because of FSR 3.1 and ray tracing performance on RDNA2 and RDNA3 cards. It's a lot better with RDNA4 and FSR4. I wouldn't bother with path tracing tbh; it demands a lot of power, cuts performance a lot, and who can really afford that kind of hardware?
LouBerryManCakes@reddit
Yeah my 9060XT 16GB runs Doom TDA 1440p at over 120FPS, as long as I leave the path tracing options off. If I turn them on, it falls to its knees with framerates in the 40s.
Whatever RT is "always on" really isn't a problem, at least with this generation of cards.
-Kalos@reddit
Path tracing is pretty horrible even on my 5090. The application for this just isn't there yet. No need to base your purchases around a feature most won't use anytime soon. Shoot, a lot of games don't even have ray tracing and they still look good
MrTomatosoup@reddit
5060ti will be fine
Islu64@reddit
There is no guarantee for that to be the case.
Nobody knows how hardware requirements will evolve, not even on the short term.
A mere 5 years ago nobody expected games to release with the horrible optimization they have now.
Will you be able to play anything you want at 1080p 60 fps on highest settings at the moment with that configuration? Absolutely.
Will that be the case in 3-4 years? Nobody knows.
PriceOptimal9410@reddit (OP)
For new games that come out later on, certainly, but I generally don't tend to play new stuff. Most of what I play tends to be at least a few years old at the newest, or from the 2010s, back when I had a GPU at all.
Getting a new rig, in my thought process, pushes the amount of games I can play from flash games and old 2000s games on my laptop to basically almost everything that exists right now.
I will have to make sacrifices for some later games, of course, but I generally don't mind lowering settings to get a smooth framerate. It's still better than an igpu which can't run most later games (or even many mid 2010s games) at more than 30 fps at the absolute lowest settings. Comparing to that standard, a 5060 ti and 7500f build feels like a huge upgrade indeed.
postsshortcomments@reddit
I personally think that you've gotten a lot of overly optimistic responses. Back in 2020, I remember seeing the exact same "1080p destroyer that should always be able to play 1080p" claims when people were discussing GPUs like the 3060 Ti and 5700 XT. When you see that phrasing used in the building community, what you should read is "it will probably be able to play most titles at above-average settings that look good for 3-4 years, then it will start reverting to a medium-settings GPU, then medium-low, then a low-settings GPU, etc., for new releases"
People tend to forget that for every 1% improvement a card sees, within 3 years the majority of that 1% (or even more) gets eaten up by de-optimization. I don't necessarily consider de-optimization to be purely a bad thing: it lets developers complete projects quicker, with lower budgets, and lets absolutely fantastic indie titles hit the market that wouldn't have been feasible otherwise. For instance, one of my 2025 GOTYs is a Runescape-like title called Black Grimoire, and it's between OSRS and RS3 in graphical quality. But even my 4070 Ti Super & 7600X struggle on any preset besides "low" (with the right settings for things like Draw Scale it looks virtually indistinguishable from HQ and runs smooth/well). If we go back to the original RS build that could run well on a 2000s Compaq's iGPU, Black Grimoire definitely does look a bit better, but a 4070 Ti still struggles with it on some settings.
Further, ultra in 2025 is not ultra in 2020. You add things like RT to the equation. You add performance-killing, totally-for-demo-purposes experimental gimmicks to the equation. You add VRAM-intensive higher-quality textures to the equation (ultra textures on a 2010-era title are probably sub-low these days). Further, you add titles that pretty much serve the purpose of selling GPUs and will never perform great on anything less than a modern GPU without those modern-GPU-selling features disabled.
If we look at the 5700 XT now, which is what the greater PC building community used to call a 1080p destroyer that'd forever be able to do 1080p... it's still a really decent 1080p card, but it's certainly not a 1080p ultra card. I say this as a builder who regularly suggests two main builds: a 3060 Ti for ultra-budget (which is basically a superior 5700 XT with early DLSS) or a 9060 XT/9070 XT. But my disclaimer when still suggesting either the 5700 XT (it's a little cheaper) or the 3060 Ti is that both GPUs will definitely be unable to run some titles, will run other titles at 1080p below or around 60 FPS, and I'd never come close to calling either an ultra card.
With the expectations you seem to have in your posts, I think it's fair to say your needs (60 FPS, totally okay with lower quality settings) are fairly realistic for the performance of a 5060 Ti in 5 years. If you don't gravitate towards new releases and often stick to optimized multiplayer titles, or don't care about skipping over them (I am much the same), even better. On the same note, when you say "for the life of the PC," what is the life of the PC? My 4670K build from 2012 is still enjoying its retirement as a non-gamer friend's browsing and social media PC, and it's doing an extremely good job; for that purpose, 12 years could be one person's definition of "the life of a PC." DLSS does help extend longevity (much like we're seeing with the 3060 Ti, though DLSS on a 5060 Ti is better than DLSS on a 3060 Ti), but at any point companies can arbitrarily drop early-gen DLSS support (I'd especially expect to see this with titles being pushed by GPU companies). That should be mostly fine for single-player titles, since you can probably use some type of third-party tool (but never count out DRM); for multiplayer titles, never count out anti-cheat services potentially flagging it.
So while yes, I'd consider it an incredible 1080p card with DLSS that should push well beyond what we saw prior to DLSS... we've never seen a 1080p card without an expiration date. But we've definitely seen and heard the majority opinion that prior-gen cards would be forever 1080p cards.
PriceOptimal9410@reddit (OP)
Hey, thanks a lot for that detailed write-up. It's intriguing advice I appreciate getting. I also learned some things.
So, as you mentioned, if I don't really care much for playing new releases, I will mostly be fine. You are spot-on there, because even with graphical upgrades in the past, it's been somewhat rare for me to get later games that would actually test the GPU to the max, simply because of how much storage they take up. That will be less of a problem now; those previous PCs were also used by family to store a ridiculous array of things (dozens of GB of pictures, work files, etc., mostly pictures though). Since I'm alone now, it will solely store *my* stuff, but I might still not play recent titles very often. I've developed a taste for older games, as mentioned before, because of being stuck with mainly an iGPU for most of my life, so my standards for graphics are also pretty low.
I'm also almost entirely a singleplayer guy, since I wasn't allowed to play multiplayer most of my life and even now I don't have that much interest in picking them up (partially due to the huge lag and ping I encounter in any multiplayer games I did try, due to my location). So admittedly, I might encounter more demanding games.
For me, I view the new PC as basically something that pushes up the amount of games I can play from just 2000s stuff and some indie games, to basically almost the entire catalogue of stuff that's come out till now. So in that way, I'm pretty glad, even if future games will require compromises to play.
For the life of the PC, basically what I mean is as long as it doesn't break and/or start losing performance massively. Even after more than a decade, when it's ridiculously obsolete, I think I can still sorta use it for playing stuff that came out in my teens and young adulthood, unless my standards for games do rise a lot in those years to make me enjoy those older games of my childhood less. That is all assuming, of course, that I even play video games at that age.
Anyhow, I will keep what you mentioned in mind and adjust my expectations accordingly.
Islu64@reddit
You specifically asked if you would be able to play on the highest settings, not if you would be able to play anything no matter the settings.
PriceOptimal9410@reddit (OP)
Fair enough. I do view high settings as a pretty nifty bonus, so I do like to run a game at settings as high as possible while getting good framerates. Aside from that, the framerate at the highest settings on modern games is also a proxy for how good the card will be; that's my thought process. If the rig can run stuff from today at over 60 fps on maxed settings, that probably means I have more headroom for later games to still get a smooth framerate, even if I might have to lower some settings.
2raysdiver@reddit
A 7500F scores 15-20% higher than the top-of-the-line i9-10900K from only five years ago. And people are still playing 1080p games on an i7-6700K. So a 7500F is likely to last you quite a while for 1080p. I know a few people who are still quite happy with their RX 6600 GPUs for 1080p gaming. A 5060 Ti or 5070 would be more than adequate for 1080p for quite some time. Both have modern ray tracing, which more games are starting to use. If anything, I might move up to a 9600X over the 7500F. At least in the US, a 7500F is about $180 and the 9600X is only $10 to $15 more.
That said, you never know what the future will hold as far as new games. Up until 18 months ago, I played at 1080p on a gaming laptop from 2016 with an Intel i7-6700HQ and a GTX 970M. I think even a good entry-level PC will handle 1080p for quite a while.
PriceOptimal9410@reddit (OP)
For my PC with a 1650, I had a 3600, and for the one I got after that, I had a 5600G, so I think a 7500F will serve me very well, especially since a lot of the games I play aren't very demanding (in relation to the modern 2020s behemoths). Though, maybe there will be some games I have trouble with, so that's something I will have to keep in mind. Of course, that's assuming that I even get those games, considering the vast amount of storage they often take up
VapeNGape@reddit
You would be fine assuming you never want a monitor upgrade. Even then a 1440p would probably be fine in a lot of games.
nanonan@reddit
My gaming setup is a 3300X with a 6600 at 1440p, you'll be more than fine.
Cloud_Snowfall@reddit
Rasterized, you are probably good, but you'll likely struggle with heavy RT/PT titles. CPU-heavy games may bottleneck your system. Not a bad system, just keep your expectations in order.
SickOfUrShite@reddit
tbh a 7800X3D/9800X3D and 5060 Ti would be a better option at this range, bc the CPU matters most at 1080p right now
M4K4T4K@reddit
Probably around 3-4 years until the most punishing games struggle to play, and 6-7 years until you start to struggle with many newly released games.
Of course, this is all if you only play newly released games. Personally, I buy games because they sound interesting and I want to play them, and I don't really take the release year into consideration, so I end up with a pretty even spread of purchases ranging from 2012-2024 (despite owning a 3090 that's more than capable). So if you take that approach, then you can probably get a solid 10+ years out of a 5060 Ti/5070.
PriceOptimal9410@reddit (OP)
My taste in games tends to be even older; there's a bunch from the 2000s. Of course, that is probably BECAUSE I never had a dGPU for most of my life, so I didn't get the chance to enjoy anything later than the early 2010s. That being said, a truly immense amount of games have come out in the 2010s and the 2020s. I will be getting great use out of this GPU as long as it works!
iLessThan3Tu@reddit
I game on a 2070 super and 7800x3d. Plays everything with no issues at 1080.
Dr_Doom42@reddit
Turn down or disable all the ray tracing stuff and you will be fine for most games
NoMither@reddit
If you can afford the 5070, it may only have 12GB, but at 1080p you'll be fine. Even Hogwarts Legacy maxed out with ultra ray tracing + frame gen uses below 12GB at 1440p (source: Hardware Unboxed). It's also a fairly big jump in performance vs. the 5060 Ti 16GB.
Mixabuben@reddit
Yes... but why play on highest settings and stick to 1080p?
Yinxld@reddit
I think so. Not a professional, but I just built a PC with a 7500F and 9060 XT. ATM I'm using a 10+ year old TV as a monitor, and while figuring out what monitor to buy, I think I'll still go 1080p. It looks fine, and I know I'll have zero problems playing what I want for a while.
randylush@reddit
why build or buy a new PC just to play at 1080p?
PriceOptimal9410@reddit (OP)
Yeah, plus my eyesight is kind of shitty, so I think I might not even be able to benefit fully from an upgrade to 1440p, lol. Would cost more and would reduce the frames I get, which is one thing I really care about
randylush@reddit
if your eyesight is shitty I don't understand why you are fretting over graphics quality lol
Little-Equinox@reddit
My eyes are pretty good, and instead of staying at 1080p, I just use my 65" TV😅
maokaby@reddit
Just bought a 32" 1440p IPS. It's okay, but too expensive. Had to set 125% scale in Windows, though, and since then it's very comfortable for working and gaming.
Vinyl_Ritchie_@reddit
Currently running a 5060 on 1440 ultrawide, you'll be fine.
PriceOptimal9410@reddit (OP)
Hmmm, so a 5060 ti or a 5070 will be more than good enough for 1080p.... Makes sense. What kinda settings and games do you run?
Vinyl_Ritchie_@reddit
Pretty much maxxed out, depends on the game I guess. I have 32gb of DDR5 and a Samsung 9100 pro m.2 drive so there's no bottlenecks. Average FPS is 150-180 at 180hz. I looked at the ti and the 70 but the extra cost didn't make sense to me.
randylush@reddit
Why did everyone's X key get sticky in the past few years? It's "maxed". "Maxxed" is not a word.
Vinyl_Ritchie_@reddit
It is, I just invented it.
_Flight_of_icarus_@reddit
That combo will work great!
Just make sure you get the 16 GB version if you buy a 5060 Ti.
Snoo_75138@reddit
No.
If "modern" games like Oblivion Remastered, released by a billion-dollar company, stutter even on a 5090, then for you, no.
Your best bet is going with indie games, waiting years for a AAA game to hopefully be patched, or sticking with older games.
The gaming market is a joke right now, especially the AAA space.
Not to mention RTX will cripple you regardless.
HankThrill69420@reddit
Yeah pretty much.
The 7500F is always a solid pick in the budget space. L3 cache is important, and that spec alone makes it better than any 8000 series CPU
Decent-Tumbleweed-65@reddit
Yes, but there are always a couple of games that will run at 59 fps instead of 60 when you enable path tracing or something. Garbage unoptimized titles anyway. You will be fine. A 3070 was released in 2020 and is still decent today; it can't play everything at high settings anymore, but it can play most games. Even a 2070 or 1070 would do fine today, it just depends on the games and settings.
iesalnieks@reddit
The reason most benchmarks that aren't trying to max out the CPU start at 1440p is that if you go any lower, the experience can get really blurry really quickly, even with no upscaling. The vast majority of games these days use TAA for their anti-aliasing, which is a fine technology as long as you don't run it at 1080p. For every game that looks fine at 1080p, there is another that looks super blurry, for example Red Dead Redemption 2.
Secondly, very few people run their games at native resolution, because the upscaled result is so good at 1440p and 4K. Upscaling to 1080p usually does not give such good results.
Both cards and the CPU will handle 1080p just fine, though even with the 5060 Ti I would consider moving to 1440p.
AmazinglyUltra@reddit
I can't vouch for the 5060 Ti, but it's not far off from my 5070. My 13600K, 4x8 3200MHz, and RTX 5070 rig has been tearing through 1440p, let alone 1080p. You'll be more than fine imo
theimponderablebeast@reddit
If all you are aiming for is 1080p/60fps you could even go down to an Intel B580 (if available in your country). But a 5060 or 5060Ti is definitely all you would need.
kloklon@reddit
highest settings usually aren't efficient. in many games medium or high looks almost as good as ultra while giving you way more fps. don't get caught in that "i absolutely need to max out all the settings" mindset, it will only make you disappointed.
GuessWhoItsJosh@reddit
I'm rocking a 5070 w/ a 5700X3D on an ultrawide 200hz 1080p monitor and am very satisfied. Can run everything on high/ultra with frames much higher than 60. I tend to play a mix of older and some newer stuff. I think you will be quite happy with your setup for your play style.
9okm@reddit
Enjoy? Yea. At what settings? Nobody really knows.
PriceOptimal9410@reddit (OP)
For me, I'm okay with lowering settings till I get around 60 fps average
9okm@reddit
Then you should be fine for a very long time.
frolfer757@reddit
I'm on a similar setup and can easily run nearly any game at high settings at a stable 144fps at 1080p.
Only hiccups are CPU heavy games (like PoE2 which is notoriously bad for FPS, regardless of pc) that sometimes throw a million different effects on your screen. There I'll usually run medium settings and occasionally drop to 35-50fps for a few frames before going back to 100-120fps. Friend has a 9800x3d & 5070ti and even he occasionally dips below 100fps there.
Your setup will be fine for a very long time.
ChoriStorm@reddit
Don't listen to FOMO users. I have a 5060 Ti 16GB paired with an i5 11400F (an inferior CPU). It runs most games at 1440p and 90-120fps on high settings, some even higher.
Silver_Scalez@reddit
Get a 24 inch monitor with a high refresh rate and you will see awesome performance at 1080p. Once you get into larger monitors, that's where the pixel density starts to make 1080p look rougher. On a 24 inch it's very crisp and you will have high fps.
BongonimusKid@reddit
I have a 3070 and a 14th gen Intel i7, 32GB DDR4, and I haven't found a game I can't play at 60fps or higher at 1080p. You should be just fine with that build for years
BongonimusKid@reddit
Also have a Legion laptop with a 4060 and it can also play all the games at 1080p and 1440p at high settings. Don't listen to the doom comments. Most new GPUs will work great for years.
ime1em@reddit
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/18.html
According to this, yes at 1080p
guilgo@reddit
I ran a 4790K + GTX 1070 for over 10 years, you'll be fine. As the years go by, you lower the quality settings of the games you play. And it's not like you'll suddenly be unable to play something from one month to the next. It's gonna be something you notice, something you feel, I guess, until it reaches a point where you realize you need to upgrade the GPU or the CPU+mobo+RAM.
wolfiasty@reddit
A 5060 Ti/5070 will be more than enough for 1080p gaming. Worst case scenario, in a few years you'll drop details to very high and/or turn on frame generation to get 60fps without any problem at all. But realistically we're talking about way over 60 fps for the next few years from that setup.
ZeisHauten@reddit
My PC is a tad weaker than yours, but I am enjoying everything at 1080p with lots and lots of room to spare for 1440p. Hogwarts Legacy and BF6 run great. Everything that doesn't require RT will run at max settings.
clean_lines@reddit
The 7500F is a great CPU. Mine is currently paired with an RX 6800 and it plays anything that is thrown at it
Wakerwasthere@reddit
Vouching for the 5060ti.
As a 1080p gamer who recently upgraded from a GTX 1660 Super, the 5060 Ti does the job well.
I'm able to play Cyberpunk 2077 on ultra settings with path tracing using DLSS Quality and Frame Gen 2X. Without frame gen, I get about 50 FPS. Honestly, while I hate how Nvidia advertises frame gen as real performance, I don't have a problem gaming with it enabled. I barely notice any input lag and I'm having a blast experiencing the world.
Expedition 33 and RDR2 also run great at the highest settings. Not a single regret with getting the 5060 Ti, as long as it's the 16 GB version.
PriceOptimal9410@reddit (OP)
A 5060 Ti can run Cyberpunk with ultra settings and RT on at 50 fps without frame gen? Now that's superb, thank you for sharing. My 1650 couldn't run the game at more than 20 fps on the lowest settings. Honestly, knowing that a card like the 5060 Ti can run a game as imposing as Cyberpunk 2077, and at max settings at that, puts me more at ease. That's more room for me to play almost any game I want (aside from the true monsters of the last two years, I suppose). I don't mind lowering settings to get a decent framerate, but being able to run a lot of games at ultra 60 fps already is a pretty big selling point for me
madskee@reddit
7500F or 9600X + 5070 @ 1440p resolution
strategicgrills@reddit
The thing is, the overall state of PC technology, despite the many problems of recent years, has advanced to the point that midrange cards like the 5060 and 9060 XT are no longer marginal 1440p cards. They're perfectly respectable 1440p cards. Some may argue that's only true for the 16 GB variants, and I won't argue with them; I'll just say that if you're looking toward the future, I consider them to be correct.
Even the humble B580 does surprisingly well at 1440p. I'd still take either an Nvidia or AMD card if at all possible, but it's nice to know people who do not have that kind of budget have a viable option.
Now, I'm not saying everyone should play at 1440p, just that it's now within easy reach. Even the 8GB variants of the current midrange cards handle it really well, provided you aren't playing a game that exhausts the card's memory. If I were going to have an 8 GB card for the foreseeable future, I'd stick to 1080p just to extend the enjoyment of it as much as possible.
Pav3LuS@reddit
No.
Current_Finding_4066@reddit
You will be fine for quite some time
9okm@reddit
No, it won’t.