InsertaGoodName@reddit
Would be nice if software “engineers” didn’t make shitty applications that required half of your cpu, ram, gpu, and disk memory. The fact that web browsers need gigabytes of ram to display some text and images controlled through a bit of markdown and javascript is criminal.
Fire2xdxd@reddit
should just use firefox.
InsertaGoodName@reddit
changing the browser alleviates the problem by at most 10%, the issue is fundamental to how the modern web is developed.
Fire2xdxd@reddit
True tbh but at least firefox is a good browser overall
Borno11050@reddit
(Modern) Web devs would place divs within divs within divs like 30 times nesting just to make a button. What else to expect?
noobmasterdong69@reddit
i dont think the modern web dev even knows what markdown is which is why theyre so massive
rokyu_@reddit
I pray everyday to nvidia's downfall and they start to sell actual good affordable consumer hardware
BalefulRemedy@reddit
They do? You can't afford 4060?
rokyu_@reddit
if you think a 4060 is "good" we don't share the same knowledge
BalefulRemedy@reddit
But it is? Tf you need 4090 for? 1 playthrough of cyberpunk?
tukatu0@reddit
Paying $300 for 4 year old tech. An xx50 card cosplaying as a higher tier, at that.
Smh, back in my day I would pay $400 for an xx70 card.
They shifted all GPUs down 2 tiers. But because the casuals have no idea, they just keep buying.
Ps. See you next year when an RTX 5050 75 watt card comes out and performs the same as the 4060, leading to you people praising the shit out of it. (It's actually just a rearranged 4060 power limited to 60%.)
You lose 10% fps on any RTX 4xxx if you set the power limit to 60%, and only 5% fps at a 70% power limit.
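A quick back-of-envelope check of that claim; a minimal sketch assuming the commenter's quoted figures (10% fps loss at a 60% power limit, 5% at 70%) rather than measured data:

```python
# Relative perf-per-watt at different power limits, using the
# thread's claimed numbers (not benchmarks).

def perf_per_watt(fps_frac: float, power_frac: float) -> float:
    """Efficiency relative to stock, where stock = 1.0."""
    return fps_frac / power_frac

stock = perf_per_watt(1.00, 1.00)  # baseline: 1.0
pl70  = perf_per_watt(0.95, 0.70)  # 70% power limit: ~1.36x efficiency
pl60  = perf_per_watt(0.90, 0.60)  # 60% power limit: 1.5x efficiency

print(f"70% limit: {pl70:.2f}x, 60% limit: {pl60:.2f}x")
```

If those figures hold, the card at a 60% power limit does 50% more work per watt than stock, which is the commenter's point about it shipping "overclocked waaay above what it needs".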
BalefulRemedy@reddit
Idk man, I don't care about power consumption and tech. It works fantastic for me and that's all that matters.
tukatu0@reddit
You pay more for less.
Meaningless conversation though. Too many people already bought. Guess they enjoy paying .... Well, whatever. A console is unironically a better choice
BalefulRemedy@reddit
I bought it for the same price I paid for the 760, 1060 and 7600 GT before them
tukatu0@reddit
Product shifting. All of those had at least 2 cards below them; the 4060 is the most barebones you can buy. The 160mm^2 4060 die is closer to the 118mm^2 chip in the GT 740 than to the GTX 760. I mean the actual compute chip inside.
The actual card size is down to the giant cooler... Well, it was probably planned to use even more power than it did. You can power limit down to 75 watts / 60% and only lose 10% fps. It's already overclocked waaay above what it needs.
But yeah, imagine paying $300 for a GT 740. Or $220 in 2014 money.
I wrote about a bunch of stuff, but you say you don't care about tech, so it doesn't matter. Regardless, the point of bringing up power consumption is that it's the same as a car engine and gasoline usage. Take two cars where the one that uses more gasoline is faster. That just means they are the same car. The context being: these are supposed to get cheaper, but the company pretends that instead of selling you the same car, they are selling you something better, for 60% more money.
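The 2014-money comparison above can be sanity-checked with simple compounding. A rough sketch; the $300/$220 prices and the ~10-year span come from the thread, so the result is an implied rate, not official CPI data:

```python
# What annual inflation rate would make $300 today equal
# $220 in 2014 money? (Thread's numbers, back-of-envelope.)

years = 10
implied_factor = 300 / 220                     # ~1.36x cumulative
annual_rate = implied_factor ** (1 / years) - 1

print(f"implied inflation: {annual_rate:.1%}/yr")  # ~3.2%/yr
```

A ~3% annual rate over the decade is in the right ballpark for US inflation, so the two quoted prices are at least mutually consistent.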
BalefulRemedy@reddit
I mean, they give me the same experience in games. So yeah, tech isn't something most consumers care about.
tukatu0@reddit
Eh, well, I was wrong about 1 thing. It's the successor to the GTX 750, not the GT 740. The GTX 750 was $120, by the way.
I don't get it. If you don't mind paying $300 for a GTX 750, why not just get a console instead? You are going to get an even better experience. Generally, at least. No worrying about drivers or anything.
BalefulRemedy@reddit
I work on PC and I prefer mouse/keyboard, + I have 144Hz monitors. Why would I downgrade to a 30 fps gamepad experience?
tukatu0@reddit
Like I said, esports is 120Hz on console.
But if you are using it for work, then either the company owns everything or they just paid you. So yeah, indeed, why would you care that something costs double what it did years ago? Work pays for it.
Spider-Man has unlocked modes which run at around 90fps, by the way. I'm sure there will be more like that now that frame gen is becoming common.
BalefulRemedy@reddit
Nah, I work hybrid, so I have my PC at home and a laptop at work. As for next upgrades: yeah, I'll sell my shit in ~2 years and buy a 5060 for a $50-100 diff
pegasusCK@reddit
Wtf is this elitist bullshit.
4060 = 3070, and either will render the majority of AAA titles at 120+ fps at 1080p on high settings without extra bullshit turned on.
You don't need 4K ultra, 16x anisotropic filtering and max raytracing to enjoy a good game. High settings at 1080p are more than enough.
Blisterexe@reddit
And the equally priced amd card is significantly better
BalefulRemedy@reddit
Idk, frame gen and DLSS are better than FSR rn. But yeah, idk what OP is yapping about.
Loliver69@reddit
They won't most likely, nvidia is making most of their money via AI. They could stop making gaming cards and it wouldn't matter too greatly to them.
Metrix145@reddit
I hope the US denies their dumbass proposal requesting a power plant for an AI center, because holy fuck, you're stuck sucking and fucking every company that even slightly helps your market.
Rendered_Pixels@reddit
Although I'm disappointed it took AI, of all reasons, to recommission a nuclear power plant, and that plant won't be on the national grid. But at least they're using it, and it's not just sitting there rotting. Keeping plants in good shape makes recommissioning cheaper and safer.
Metrix145@reddit
I am not a big fan of large corporations having their own power grid.
Substantial-Cat2896@reddit
We're probably close to what our current understanding of the technology can achieve. The difference between the latest card series has been minuscule compared to the past
Uncommonality@reddit
Me when I need the new RTX 50000 gigabeast Gaming™ Ultra™ Ti in order to play the newest Call of Duty™ because the developers had to include 500 gigabytes of 32k texture files in an uncompressed format and the sweat beads on Man McMan's forehead (the protagonist, he's so cool and original) get to have raytraced reflections as he talks about how great the US Marine Corps is
Blurg_BPM@reddit
500GB? That's low for a CoD game, it's more around 3000PB
Mizznimal@reddit
No thats the children’s homework folder
baphometromance@reddit
Wow you sure do help a lot of kids do their homework
DiscountParmesan@reddit
you forgot that shader preloading bullshit
letsgoiowa@reddit
You mean you're mad that shaders are precompiled? Bruh people are mad when other games DONT do that because it leads to horrific stutters
Smh
DiscountParmesan@reddit
I'm mad because some games can pull it off and other games prefer to waste your time rather than optimize their shit
Sen-oh@reddit
It's him. It's John Duty
dm_me_tittiess@reddit
I still use my 1050ti.
Uncommonality@reddit
congrats, this was sarcasm though.
dm_me_tittiess@reddit
Mine wasn't
AFrenchLondoner@reddit
I'm playing DK Country on my emulator and having more fun than you
Uncommonality@reddit
Breaking news: redditor too highly regarded to register sarcasm
lucasthebr2121@reddit
A redditor with a frontal cortex smaller than the human average*
Pass_us_the_salt@reddit
So, just a redditor.
lucasthebr2121@reddit
Yes
YoungDiscord@reddit
...That you play on a tiny 720p display because you blew all your cash on the RTX
lucasthebr2121@reddit
And also because if you play on a higher resolution it lags the game for some unknown reason
Spiderpiggie@reddit
That’s the proprietary 3rd party anti-cheat and anti-piracy service, which cuts game performance in half for no discernible reason
400asa@reddit
It's McMaan, actually. People often disregard his tragic immigration backstory probably because you can skip those cutscenes but imo he's one of the better fleshed-out characters in the franchise, along with Trayvon.
gloumii@reddit
Only if Nvidia becomes uninterested in pushing GPUs due to AI and uses architectures that aren't great for gaming/3D. But for now I am pretty sure they will continue to be ahead for some time. AMD might close the gap a little, but it hasn't done so yet. Intel is far behind on all levels and might even completely die. I hope not, as competition is good for consumers, but it might already be too late
wololowhat@reddit
They won't, because nepotism; the Nvidia and AMD CEOs are cousins
GardenofSalvation@reddit
They are like 2 families away I doubt they even speak often.
Acou@reddit
They're both CEOs of major companies in the same industry; I'm sure they ring each other now and then
Explorer_the_No-life@reddit
Imagine changing your GPU or CPU more frequently than once a decade. I still use my shitty GTX 1050 Ti and 9th-gen i5. Then again, I avoid the AAA shit.
MindGoblin@reddit
There are like a total of 2 "triple-A" games worth playing per year these days, and the PC ports are usually janky as fuck and barely optimised.
spookybaker@reddit
No
ph16053@reddit
Downvote me if you want but I’m gonna drop an important fact that Reddit keeps saying isn’t true, 90% of all software is built and designed for Nvidia hardware. Not having an Nvidia card makes rendering things wonky, new triple A games wonky, it makes indie games wonky, and it makes many emulators straight up unplayable. I don’t care how good price to performance is if I can’t get my small passion project software made by 5 people to work right.
Accomplished_Bet_781@reddit
Okay, but why is my AMD card working fine then? I don't use emulators or play indie games, but I feel like 90% of people don't use them either. And I have had no issues with AAA games, or the esports games that I play, and no issues with simracing games either.
DeathSabre7@reddit
Don't forget the backroom, under-the-table green (from Intel and Nvidia) paid to board partners like Asus, Gigabyte etc. to not have AMD stuff in their lineups
basti329@reddit
Bro the GPU market is thriving.
AI cards etc. are giving Nvidia more cash than ever, and AMD is now getting into that game as well.
We are fucked, and I doubt we get good consumer products that aren't priced to hell any time soon.
ty6vx2@reddit
AMD is and has always been a meme. I fell for it once but won't get fooled again. I know the internet loves their "muh cheap AND better" alternatives, but there's a reason the status quo is the status quo: it fuckin works anytime it's supposed to
MLGQu1ckSc0p3r@reddit
AMD plans to focus on mid-tier GPUs for now, so Nvidia will still have the edge in peak performance, but AMD already has, and probably will continue to have, the edge in bang-for-your-buck mid and low tier GPUs
Explorer_the_No-life@reddit
This is acceptable, I guess.
Fangslash@reddit
Brother... GPU tech ain't stagnating. Nvidia just doesn't care about gamers, because there are literally CEOs begging them to sell GPUs at $30k a piece while gamers cry about their $700 70-series.
iDontRagequit@reddit
Who the fuck is out here still buying new PC hardware?
is yalls VR porn really that hard to run?
De_Dominator69@reddit
Clearly we need Intel to shift to making GPUs and replace Nvidia as the go-to, just to make it cyclical
leutwin@reddit
Honestly I'm looking forward to a 3-way standoff where AMD, Intel, and Nvidia all produce competitive GPUs and CPUs
TheSecondTraitor@reddit
No. Nvidia is expensive precisely because they know they are years ahead of competitors and so they can get away with it.
Connect_Hospital_270@reddit
"Will AMD and Intel save us?"
Narrator: No.
ThatTysonKid@reddit
Almost certainly not. I'm hoping Intel can come in with something worthwhile. That is, if they dont kill it to save costs considering their recent fuck-up.
PhgAH@reddit
Nvidia ain't stagnating, they just don't give a fuck about gaming GPUs. Their AI cards sell for 20x the gaming cards, with a 70% profit margin.
BBtheboy@reddit
No. Next question