Nvidia CEO: "We can't do computer graphics anymore without artificial intelligence" | Jensen Huang champions AI upscaling in gaming, but players fear a hardware divide
Posted by chrisdh79@reddit | hardware | View on Reddit | 405 comments
From-UoM@reddit
Knowing Nvidia they will add something again on the 50 series. It will be hated at first, then everyone else will copy it and it will become accepted.
Massive_Parsley_5000@reddit
My guess is NV might push hardware denoising for the 50 series.
That would effectively bury AMD's recent announcement of stapling more of their RT cores into rdna 4....just look at games like Alan Wake 2 and Star Trek Outlaws....denoising adds a massive perf cost to everything RT related. Having dedicated HW to do it would likely give NV a full generation's lead ahead of AMD again.
Fantastic_Start_2856@reddit
Ray Reconstruction is already hardware accelerated denoising.
Strazdas1@reddit
No, its a mix of software and hardware denoising.
Fantastic_Start_2856@reddit
No. It’s pure hardware. It doesn’t use hand-tuned (aka software based) denoising algorithms
“Ray Reconstruction, is part of an enhanced AI-powered neural renderer that improves ray-traced image quality for all GeForce RTX GPUs by replacing hand-tuned denoisers with an NVIDIA supercomputer-trained AI network that generates higher-quality pixels in between sampled rays.”
https://www.nvidia.com/en-eu/geforce/news/nvidia-dlss-3-5-ray-reconstruction/
basseng@reddit
Hardware accelerated is still an order of magnitude slower than specific hardware (as in an ASIC). Just look to NVENC for an example of this in action.
2tos@reddit
IMO these techs aren't a generational lead. I need raw power; I don't care about dlshit or RT or FSR, I just want to play the game and that's it. If Nvidia comes out with an RTX 5060 at $350 with all these techs and AMD pulls its 8600 XT with the same performance for $275-290, I don't even need to think about which to buy.
Strazdas1@reddit
What are you going to do with raw power?
I guess you also don't care about tessellation, shaders, LODs, etc?
conquer69@reddit
Generational raw performance improvements are decreasing. It's harder and more expensive than ever before.
But that is raw power. RT performance has increased drastically. It's weird that you get exactly what you said you wanted but "don't care".
nisaaru@reddit
80% market share doesn't mean everyone has a >3070/4070 GPU, which is perhaps the performance required for dynamic AI assets. Without consoles providing the base functionality to do this, it makes no market sense anyway.
Strazdas1@reddit
Good thing those GPUs are not the requirement.
itsjust_khris@reddit
Why would that happen as long as AMD has consoles? Then such a game could only be targeted at recent Nvidia GPUs on PC, which isn’t a feasible market for anything with the resources necessary to use all these cutting edge techniques in the first place.
Strazdas1@reddit
Consoles are getting increasingly irrelevant. Xbox Series X sold a third of what Xbox 360 sold and half of what Xbox One sold. Same trend for Playstation consoles as well.
Akait0@reddit
What you're describing is only feasible for a game or a couple of them. No dev will willingly limit their potential customers, so they will make their games run on the maximum amount of hardware they can. Nvidia would bankrupt itself if it had to pay every single game studio, and that's not even taking into account all the studios that would never take their money because they are either owned by Microsoft/Sony and would never stop making games for the Xbox/PS5, which run on AMD hardware, or simply make their money from consoles.
Even games like CP2077 end up implementing AMD software (although later) simply because there is money to be made from that, even though they absolutely get the bag from Nvidia to be a tech demo for their DLSS/Raytracing.
And even if Nvidia had 99% market share, it wouldn't happen, because even their own older GPUs wouldn't be able to run it, maybe not even the new ones from the 60 series, which is the most commonly owned according to steam.
ThankGodImBipolar@reddit
Developers would be happy to cut their customer base by 20% if they thought that the extra features they added would generate 25% more sales within the remaining 80%. That’s just math. Moreover, they wouldn’t have to deal with or worry about how the game runs on AMD cards. It seems like a win-win to me.
phrstbrn@reddit
The majority of big budget games these days are cross-platform, and the majority of sales are still on consoles. The situation where a developer purposefully guts the PC port to the point where it runs worse than the console version is unlikely. Everything so far has been optional because consoles can't run this stuff.
The games which are PC exclusive are generally niche or aren't graphically intensive games anyways. The number of PC exclusive games that are using state of the art ray-tracing can probably be counted on one hand (if you can name more than 5, it's still a very small number relative to number of cross platform games that have extra eye candy added for the PC port)
Strazdas1@reddit
yes
no
Incorrect. Many games have mandatory RT despite it causing significant performance issues on consoles. It simply saves tons of development time to do this.
They are doing this less and less, just like with any other tech in videogames.
The opposite is usually true.
ProfessionalPrincipa@reddit
Yeah I don't know what crack that guy is on. Games from big developers are increasingly trying to get on to as many platforms as they can to try and recoup costs.
Wide market console titles are headed this way. Exclusivity agreements are starting to turn into millstones.
Even indie games get ported to as many platforms as possible including mobile.
TinkatonSmash@reddit
The problem with that is consoles. The PS5 uses all AMD hardware. Chances are they will stick with AMD for next gen as well. Unless we see a huge shift towards PC in the coming years, most game devs will always make sure their games can run on console first and foremost.
frumply@reddit
The console divide will keep things from becoming a nvidia monopoly, while still allowing nvda to use their AI arm to continue and make huge strides. I'm cool with being several years behind (I was on a 1070 till 2023 and probably won't plan on upgrading from my 3070 for a while) and would much rather they keep making cool shit. Also a nonzero chance that the next nintendo console will still take advantage of the nvidia stuff in a limited manner, kind of like what it appears the new switch may be doing.
Aggressive-Bus-1972@reddit
Denoising and RTX won't make 80% of people pay 25% more
Some people will just wait 120% longer to upgrade
ThankGodImBipolar@reddit
You have grossly misunderstood my comment. I didn’t advocate for either upgrading or raising prices at all.
Strazdas1@reddit
That's like saying no game will limit their potential by including ray tracing because only the 2000 series had ray tracing capability. Except, a ton of them did and it was fine.
Geohie@reddit
so I guess console exclusives don't exist
They don't need every game studio, they just need a few 'Nvidia exclusives'. If an Nvidia GPU can run all PC games but AMD GPUs can't - even if it's only a few dozen games - people will automatically see the Nvidia card as a major value add. It's why the PS5 won against the Xbox Series X: all of Xbox was on PC, but the PS5 had exclusives.
pokerface_86@reddit
this generation? there’s barely any
Geohie@reddit
Switch is a console btw
pokerface_86@reddit
switch is last generation btw
Geohie@reddit
It's still current generation by definition, as there is no competitor to the Switch out yet.
If we're talking about power, the Switch is 2 gens ago so you're wrong either way.
KristinnK@reddit
That's not at all how home video game console generations are defined. The Nintendo Switch is indeed classified as an eighth generation console, while the current generation is the ninth generation.
However, it is true that the Switch is a bit of a special case, being released midway through the life of the eighth generation as a rushed-out replacement for the commercially failed Wii U. You could conceivably call it an eighth-and-a-half generation console. But it certainly is not current generation.
vanBraunscher@reddit
This strikes me as a very... charitable take.
It took them a while, but triple A game devs have finally realised that they are benefitting from rapidly increasing hardware demands as well, so they can skimp on optimisation work even more, in the hope that the customer will resort to throwing more and more raw power (hence money) at the problem just to hit the same performance targets. And inefficient code is quickly produced code, so there's a massive monetary incentive.
And it seems to work. When Todd Howard smugly advised Starfield players that it was time to upgrade their hardware, because they started questioning why his very modest-looking and technically conservative game required a surprising amount of grunt, the pushback was minimal and it was clear that this ship has sailed. And this is not a boutique product à la Crysis situation, but Bethesda we're talking about, who consider their possible target audience to be each and every (barely) sentient creature on the planet, until even your Grandma will start a youtube channel about it.
And that's only one of the more prominent recent examples among many, overall optimisation efforts in the last few years have become deplorable. It's not a baseless claim that publishers are heavily banking on the expectation that upscaling tech and consumers being enthralled by nvidias marketing will do their job for them.
So if NVIDIA trots out yet another piece of silicon-devouring gimmickry, I'd not be so sure whether the software side of the industry could even be bothered to feign any concern.
Ok, and that's just downright naive. Even right now people with cards in higher price brackets than the 60 series are unironically claiming that having to set their settings to medium, upscaling from 1080p to 2K and stomaching fps which could have been considered the bare minimum a decade ago is a totally normal phenomenon, but it's all sooooo worth it because look at the proprietary tech gimmick and what it is doing to them puddle reflections.
The market has swallowed the "if it's too choppy, your wallet was too weak" narrative with gusto, and keeps happily signalling that there'd be still room for more.
itsjust_khris@reddit
There’s a big difference between your examples of poor optimization or people legitimately running VERY old PCs and games requiring extremely recent Nvidia gpus for fundamentally displaying the game as described in the top comment. No game dev is going to completely cut out consoles and everybody under the latest Nvidia generation. That makes zero sense and has not happened.
f1rstx@reddit
BMW (Black Myth: Wukong) says otherwise; it uses RTGI by default and sold very well. It's sad that many devs are still forced to limit themselves to support outdated hardware like AMD RX 7000 cards. But a well made game with RT will sell well anyway.
From-UoM@reddit
Wouldn't that be DLSS Ray Reconstruction? Though that runs on the tensor cores.
DLSS 4 is almost certainly coming with RTX 50. So it's anyone's guess what it will be. Nobody knew about Framegen till the actual official announcement.
bubblesort33@reddit
They already showed off the texture compression stuff. Maybe that's related. DLSS 4, or whatever version is next, could generate 2 or 3 frames, whatever is needed to hit your monitor's refresh rate.
Massive_Parsley_5000@reddit
Ray reconstruction is nice, but isn't perfect (see numerous DF, GN, and HUB videos on the quality), and comes at a performance cost as well. Real hw denoising would be significantly faster, and higher quality as well.
Qesa@reddit
But what would "real hardware denoising" look like? Are you implying some dedicated denoiser core akin to a ROP or RT core? Those two are both very mature algorithms that standard SIMD shaders don't handle well. While denoising is an open problem, encoding it into a fixed logic block is probably a bad idea, because then some new algorithm could be invented and suddenly those ASICs are wasted sand. And if AI ends up being a better approach than something algorithmic, it's already hardware accelerated anyway.
basseng@reddit
I would imagine a small portion of the GPU would essentially be a denoising ASIC. Hell it might even be its own dedicated chip.
It would be a specific hardware implementation of their best denoising algorithm at the time of the chip design, perhaps enhanced due to the speed benefits the ASIC would bring.
So it'd be NVIDIA Denoise 1.2a, and you'd have to wait until next gen for the 1.3b version.
There's no way you'd waste sand; the speed benefits of the dedicated hardware alone would be an order of magnitude more than what could be achieved by any software implementation.
Also nothing would stop Nvidia from combining techniques if there was some kind of miraculous breakthrough; you'd basically get a 2-pass system where the AI denoiser would have a vastly easier (and thus faster) time of applying its magic thanks to the hardware denoiser already managing the broad strokes.
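To make that cost concrete, here is a minimal sketch (an illustration only, not NVIDIA's actual algorithm; the function and parameter names are made up) of a naive edge-aware spatial filter over a 1-sample-per-pixel ray-traced image. Production denoisers such as SVGF or Ray Reconstruction add temporal accumulation on top, and this per-pixel neighbourhood work is exactly the kind of thing a fixed-function block could eat:

```python
# Minimal sketch, not NVIDIA's denoiser: a naive edge-aware spatial filter
# over a noisy 1-spp ray-traced image, weighting neighbours by G-buffer
# normal similarity so geometric edges stay sharp.
import numpy as np

def naive_denoise(color, normals, radius=2, sigma_n=0.2):
    """color: HxWx3 noisy radiance, normals: HxWx3 G-buffer normals."""
    h, w, _ = color.shape
    out = np.zeros_like(color)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            nbr_c = color[y0:y1, x0:x1].reshape(-1, 3)
            nbr_n = normals[y0:y1, x0:x1].reshape(-1, 3)
            # Down-weight neighbours whose normals differ from the centre pixel.
            wgt = np.exp(-np.sum((nbr_n - normals[y, x]) ** 2, axis=1) / sigma_n)
            out[y, x] = (nbr_c * wgt[:, None]).sum(0) / wgt.sum()
    return out
```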
jasswolf@reddit
The performance cost is there in Star Wars Outlaws because the game also cranks its RT settings to meet its minimum requirements. Outside of that, it's just a slightly more expensive version of DLSS, one that's designed with full RT (aka path tracing) in mind.
This is a short term problem, and your solution is equally short term.
OutlandishnessOk11@reddit
It is mostly there with the latest patches in games that implemented ray reconstruction. Cyberpunk added DLAA support; at 1440p with path tracing it no longer has that oily look. Star Wars Outlaws looks a lot better since the last patch. This is turning into a massive advantage for Nvidia in games that rely on denoising, more so than DLSS vs FSR.
From-UoM@reddit
It's hit or miss at the moment, I agree. But like with other models, with training and learning it will improve.
There is no limit to how much all functions of DLSS can improve especially the more aggressive modes like Ultra Performance and Performance.
Typical-Yogurt-1992@reddit
I think noise reduction has been around since before DLSS3. Quake II RTX, released in March 2019, also uses noise reduction for ray tracing. Frame generation has also been on chips in high-end TVs for a long time. What made DLSS FG unique was that it used an optical flow accelerator and a larger L2 cache to achieve high-quality frame generation with low latency.
If the capacity of the L2 cache increases further or the performance of the optical flow accelerator improves, frame generation will not be limited to one frame but will extend to several frames. The performance of the Tensor Core is also continuing to improve. Eventually it will output higher quality images than native.
WhiskasTheCat@reddit
Star Trek Outlaws? Link me the steam page, I want to play that!
peakbuttystuff@reddit
GUL DUKAT DID NOTHING WRONG
Seref15@reddit
It's an entire game where you just play a Ferengi dodging the Federation space cops
Massive_Parsley_5000@reddit
:p oops
Quaxi_@reddit
Isn't DLSS 3.5 ray reconstruction basically an end-to-end hardware tracing-to-denoising pipeline?
basseng@reddit
No it's software mixed with hardware acceleration, so it's still a software algorithm running on general purpose compute units, even if it is accelerated by more specialized hardware for chunks of it.
So it's like the GPU cores (cuda cores) are specialized hardware acceleration (compared to a CPU), and the tensor cores within them are just even more specialized - but still not specific hardware for software to run on.
What I suspect nvidia might do is add a denoising ASIC: a fixed, specific algorithm literally baked into a chip. It can only run that algorithm, nothing more - giving up general (even specialized) use for vastly improved speed at 1 and only 1 thing.
Think hardware video encoding, which only works on specific supported codecs. For example, NVENC can encode to H.264, HEVC, and AV1, but only those and usually with limited feature support, and each of those is actually its own specific region of the chip (at least partly).
ASICs are an order of magnitude faster, so even if the ASIC only took control of a portion of that pipeline it would represent a significant performance increase - I'd wager an immediate 50% performance or quality gain (or some split of both).
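To illustrate how codec-specific such a block is, here is a rough sketch driving the encoder from a script (assumes an ffmpeg build with NVENC support; h264_nvenc, hevc_nvenc and av1_nvenc are ffmpeg's standard NVENC encoder names, and the file names are placeholders):

```python
# Sketch: offloading encoding to the NVENC block vs. the CPU via ffmpeg.
# The fixed-function encoder only understands the codecs it was built for;
# anything else has to fall back to a software encoder such as libx265.
import subprocess

def encode(src, dst, codec="hevc_nvenc"):
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-b:v", "8M", dst],
        check=True,
    )

encode("gameplay.mkv", "gameplay_hw.mp4", "hevc_nvenc")  # ASIC path
encode("gameplay.mkv", "gameplay_sw.mp4", "libx265")     # CPU path, far slower
```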
ExpletiveDeletedYou@reddit
So you upscale then denoise the upscaled image?
Is DLSS even bad for noise?
Enigm4@reddit
I doubt they will push multiple groundbreaking technologies when they are already comfortably ahead of AMD. If anything I think we will just see a general performance increase due to a big increase in VRAM bandwidth and if anything they will probably tack on an additional interpolated frame in their framegen tech.
ResponsibleJudge3172@reddit
Has not stopped them innovating all this time with 90% of the market
No_Share6895@reddit
i mean... this one would be a good thing imo.
TheAgentOfTheNine@reddit
Like it happened with hairworks, physX and all the new features that were critical to gaming before.
Strazdas1@reddit
stuff like hair/cloth physics are just taken for granted nowadays and are in every major game.
TheAgentOfTheNine@reddit
They aren't run on the GPU using nvidia's proprietary software suite. Just like you don't need a superfancy fpga in your monitor to have variable refresh rates despite how hard nvidia pushed manufacturers to not comply with the DP standard and instead make exclusively g-sync models.
randomkidlol@reddit
dont forget gameworks, ansel, gsync, etc. all proprietary crap that died once they could no longer squeeze consumers for temporary gimmicks
Strazdas1@reddit
Hey man Ansel is something i still use but it needs developer to support it in the game.
From-UoM@reddit
Physx is used in so many games.
https://www.pcgamingwiki.com/wiki/List_of_games_that_support_Nvidia_PhysX
Its still being used a lot. Recent Games like Black Myth Wukong and Shin Megami Tensei V Vengeance both used it.
Enigm4@reddit
I'm still not thrilled about having layer upon layer upon layer of guesswork algorithms. First we get visual bugs from VRS, then RT de-noising, then ray reconstruction (and probably more RT tech I am not even aware of), then we get another round of visual bugs with up-scaling, then we finally get another round of bugs with frame generation. Did I miss anything?
All in all, most of the image looks great, but there are almost always small visual artifacts from one technology or another, especially when it comes to small details. It gets very noticeable after a while.
Plank_With_A_Nail_In@reddit
You don't really get a choice lol. I love people telling the market leading company that they are doing it wrong.
Enigm4@reddit
Intel was a market leader once, until they weren't. I will say whatever the fuck my opinion is.
Strazdas1@reddit
and Intel still isn't listening to your opinion. So nothing really changed in that regard.
Enigm4@reddit
At least I am not a sheep that doesn't have any opinions and follows blindly.
Strazdas1@reddit
I always have options because i buy the best hardware for my use case without brand loyalty.
skinlo@reddit
What a weird attitude. You don't need to defend trillion dollar companies.
ProfessionalPrincipa@reddit
Layering all of these lossy steps on top of each other introduces subtle errors along the way. I guess sorta like generational loss with analog tape copying. I'm not a fan of it regardless of the marketing hype.
-WingsForLife-@reddit
You're talking as if traditional game rendering methods have no errors themselves.
conquer69@reddit
RR converts this into a single step. It's a fantastic optimization and why it performs slightly faster while improving image quality.
NaiveFroog@reddit
You are dismissing probability theory and calling it "guess work", when it is one of the most important foundations of modern science. There's no reason to not believe such features will evolve to a point where they are indistinguishable to human eyes. And the potential it enables is something brute forcing will never achieve.
Enigm4@reddit
!RemindMe 10 years
witherscarf@reddit
While I understand where you're coming from, that's simply the nature of the game with real time rendering. It's always, always been full of guesswork and shortcuts and optimisations. This isn't specifically new.
Enigm4@reddit
I'm just really not a fan of temporal artifacts. That is something we are getting way too much of now with upscaling, frame gen and de-noising. All three of them are adding each of their own temporal artifacts.
Boomy_Beatle@reddit
The Apple strat.
aahmyu@reddit
Not really. Apple removes features, it doesn't add new ones.
sean0883@reddit
Or: they add 4-year-old features the competition has always had, allow you to do something extra but meaningless with them, and call it the next greatest innovation in tech.
Grodd@reddit
A common phrase I've heard about emerging tech: "I can't wait for this to get some traction once Apple invents it."
pattymcfly@reddit
A great example is contactless payment and/or chip+PIN adoption in the US. The rest of the world used contactless credit cards for like 15 years and there was zero adoption here in the US. After Apple Pay launched, it took off like crazy and now the vast majority of sales terminals take contactless payments.
Strazdas1@reddit
To be fair, you still use magnetic strips for your credit cards, which is pretty much banned anywhere else due to how unsafe that is. You still use checks. US is extremely behind in financial tech.
pattymcfly@reddit
Totally agree
qsqh@reddit
Out of curiosity, how long have you had contactless credit cards in the US?
gumol@reddit
I remember that I only got upgraded to a chip credit card around 2015. The US banking system is an outdated joke. I just paid 30 bucks to send a wire transfer last week.
I got the Apple Pay iPhone right after it was released; I couldn't use it in the States because nobody had contactless terminals. But when I traveled to my eastern European home country right after, I could use it basically everywhere.
qsqh@reddit
I remember using contactless credit card here in brazil around ~2010 already, and it was accepted pretty much everywhere, and since you mentioned wire transfers, we get that for free+instant as well, its weird how we are so much behind in certain things, but for some reason our banking system is top tier lol
pattymcfly@reddit
Only about the last 7 years. Maybe 10. Definitely not before that.
sean0883@reddit
And you're only talking about major adoption. We had it 15 years ago. I remember getting the card with the new tech, and it went exactly like you said: nobody supported it, so bank removed it from their cards, only recently reintroducing it. It's so very much still not used in the US (even if finally widely supported) that when I went to the UK for the first time about a year ago I had to finally setup Google Pay.
It's not that I can't use it in the US. It's that it's still not at 100% support, so I use the method that is.
jamvanderloeff@reddit
It was well before that, the big three card companies all had EMV compatible contactless cards generally available in 2008, and trials back to ~2003 (including built into phones)
pattymcfly@reddit
Sure, but the vast majority of cards did not have the NFC chips in them and the vast majority of vendors did not have the right PoS equipment.
RudyHuy@reddit
Yeah it was so surprising to me when I visited 10 years ago that I couldn't use contactless anywhere and in some places I even had to swipe.
Like a trip to the past.
Strazdas1@reddit
I remember seeing Steve Jobs claim that their iPod was the first ever portable digital player while holding my Creative MP3 player in my hands.
PM_ME_UR_THONG_N_ASS@reddit
lol you guys are such haters. Does the pixel or galaxy have satellite comms yet?
sean0883@reddit
That's a very niche thing to flex, but I'm happy for you - and this is exactly what I'm talking about. Those types of phones have existed for decades.
PM_ME_UR_THONG_N_ASS@reddit
Which pixel or galaxy can I get to replace that feature on my iPhone? Or do I have to carry a pixel and a garmin inreach?
sean0883@reddit
Again, it's niche. 99% of users won't use it. You're flexing something irrelevant. But yes, if I wanted it, I'd consider picking up an iPhone to replace the Garmin.
Munchbit@reddit
Or their competition lets a feature languish, and Apple takes the same feature, modernizes it, and applies a fresh coat of paint. At this point the competition notices how much attention Apple's new enhancements are getting, prompting them to finally do something about it. Everybody wins at the end.
pattymcfly@reddit
It’s not just a coat of paint. They make it simple enough for the tech illiterate to use. For power users that means there are often tradeoffs that they don’t like.
sean0883@reddit
I couldn't care less about what they do with stuff to make it more accessible - if that's actually how they did it.
"We added (a feature nobody asked for prior), and made it so Android can never be compatible with our version of it, and its only for the two most recent phones. You're welcome."
The fact that I can receive high resolution pics/gifs via text from Apple, but not send them: Is definitely a choice. Our family and fantasy sports chats were kinda limited in the mixed ecosystem and caused us to move to a dedicated chat app.
pattymcfly@reddit
Completely agree on their bullshit with making android users a pain in the ass to communicate with.
Boomy_Beatle@reddit
And then other manufacturers follow. Remember the headphone jack?
metal079@reddit
I remember Samsung and Google making fun of them only to immediately copy them like the cowards they are
Ilktye@reddit
Which then others copy. Like removing headphone jack.
DavidsSymphony@reddit
Nvidia innovates way more than Apple; it's not even close.
Bad_Demon@reddit
Lool how? G sync is dead, gameworks is dead, everyone can do RT and AI, nvidia just has a ton of money for marketing.
Boomy_Beatle@reddit
Man, I'm just making a joke, I'm not trying to encourage dickriding here.
Awankartas@reddit
Knowing NVIDIA they will make the 5xxx series of cards, release said feature, lock it behind the 5xxx series, tell all older card owners SUCK IT, and slap a $2199 price tag on the 5090.
I am saying that as an owner of 3090 which now needs to use AMD FSR to get framegen. Thanks to it I can play C77 fully pathtraced with 50-60FPS at 1440p at max quality.
hampa9@reddit
How do you find frame gen in terms of latency? I didn’t enjoy it for FPS games because of that unfortunately.
Awankartas@reddit
Amazing. C77 without it while using path tracing is stuttery mess at 20-35fps.
hampa9@reddit
Don’t you still get the latency of the lower fps?
Strazdas1@reddit
The latency remains the same as before.
Awankartas@reddit
I mean i get same latency as 30fps. But framerate is A LOT smoother.
Kaladin12543@reddit
You could use FSR Frame gen with DLSS using the mods. You are not forced to use fsr.
Awankartas@reddit
DLSS3 frame gen does not work on 3090.
So yes i am forced to use FSR3 frame gen.
Liatin11@reddit
He's saying if you're willing to go onto nexusmods, there is a mod that allows you to use dlss 2 + fsr3 (frame gen)
Awankartas@reddit
So you are saying there is a mod that allows me to use DLSS3 framegen cool.
Kaladin12543@reddit
Cyberpunk update added FSR 3.0 which forces you to use FSR upscaler and FSR framegen together.
AMD has since released FSR 3.1, which allows you to use DLSS upscaling and add FSR framegen on top of that, because DLSS is the superior upscaler.
Awankartas@reddit
So you are saying i can't use DLSS3 frame gen after all ?
TBoner101@reddit
lmfao, (sorry). I'll try to explain it for ya, but this is gonna be LONG. Think of the two as separate technologies, upscaling (DLSS 2, FSR 2), and frame generation (DLSS 3, AMD's FSR 3.0). To all you pedantic fucks like myself, this is a simplification so bear with me — please and thank you.
Remember DLSS in the beginning? It initially launched w/ Turing (RTX 2000), then the much improved DLSS 2 arrived w/ Ampere for RTX 3000 cards. Then AMD released its competitor in FSR 1, then eventually FSR 2. These are upscaling technologies, the 'free' frames you get by rendering an image at a lower resolution, then upscaling to a higher res on your monitor/TV. ie: DLSS Performance on a 1440p monitor renders the game @ 720p, then 'upscales' the image which results in a minimal loss of quality (reason it's 720p is cause Performance renders @ 50% resolution, so on a 4K screen it'd be 1080p using Performance).
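To put numbers on that, here's a tiny sketch of the render-resolution arithmetic (the scale factors below are the commonly cited per-axis values and are an assumption; exact presets can vary per game):

```python
# Each upscaler quality mode renders internally at a fraction of the output
# resolution per axis; the upscaler then reconstructs the full-res image.
SCALE = {"Quality": 2 / 3, "Balanced": 0.58,
         "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(2560, 1440, "Performance"))  # (1280, 720)  -> the 720p case
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) -> the 1080p case
```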
DLSS uses algorithms which along w/ the help of AI (via tensor cores), allows it to maintain or even improve visual fidelity depending on the preset, while AMD uses a software approach in FSR which is not as sharp but still incredibly impressive nonetheless (unless you're zooming in and looking for flaws).
Then DLSS 3 came out w/ Frame Gen, which is much more complicated than this but I'm gonna try to simplify it. Essentially, it examines two rendered frames: the previous frame (frame #1) + the following frame (frame #3) to guesstimate how a frame in between those two should look like (we'll call this frame #2). Then, it interjects this 'generated' frame in the middle of the two. Think of it as the meat in a sandwich or hamburger; you have the buns (frames #1 and #3), then FG is the meat, smack dab in the middle. So now you have 3 frames instead of 2 which as a result, makes the game look smoother, almost analogous to more FPS. But not quite; the downside is that it still has the latency of just two frames so it might "feel" like 60fps instead of the 120fps it claims that it outputs, along w/ other side effects such as artifacts and weird UI (ie: a HUD in-game).
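A minimal sketch of that "meat in the sandwich" idea (purely illustrative - real frame generation uses motion vectors and optical flow rather than this naive blend, which would ghost on anything that moves - but it shows where the generated frame sits and why latency still tracks the two real frames):

```python
# Sketch only: synthesize a crude middle frame from two real rendered frames.
import numpy as np

def naive_midpoint_frame(frame1, frame3):
    """frame1, frame3: HxWx3 float arrays (the two real rendered frames)."""
    return 0.5 * frame1 + 0.5 * frame3  # the generated "frame #2"

def present_sequence(frame1, frame3):
    # Display order: real frame, generated frame, real frame.
    return [frame1, naive_midpoint_frame(frame1, frame3), frame3]
```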
The complex algorithms used by AI in DLSS 3 are accelerated thanks to specialized hardware (optical flow accelerators). However, RTX 3000 cards also have this hardware, just not as strong or as many as RTX 4000 cards, but Nvidia refuses to enable it. Also, DLSS 2 is heavily involved in frame generation, so much so that an upscaler is necessary for it. So AMD does the same thing here: a software approach to a solution that Nvidia uses hardware (and charges us extra) for. It's called FSR 3, and it uses the FSR 2 upscaler in the process. That is what the recent CP2077 update added. The big difference is that AMD allows everyone to use it, whereas Nvidia doesn't even let its own customers of previous generations use it. In practice, it's significantly more complicated than this.
However, these two pieces of technology, upscaling and frame generation, are inherently separate. They're just combined together in both DLSS 3 & FSR 3. That's what FSR 3.1 does. AMD separated the two so that Nvidia/Intel card owners can use their frame generation. What's cool is that someone like you w/ a card that supports DLSS 2, can use Nvidia's tech to perform the upscaling, then use AMD's FSR 3 to perform the frame generation. Since DLSS 2 > FSR 2, you get the best of both worlds (cause Nvidia won't give it to you). It's not officially supported by the game, but there's a mod that allows you to do it.
Anyone can correct me if wrong, but let's try not to make it unnecessarily more complicated than it needs to be. There should be no gatekeeping when it comes to knowledge and information, as long as an individual is willing to learn (and puts in the effort, ofc).
Awankartas@reddit
And now that you wrote all of that, look at the original comment and ask yourself how it relates to what you asked. Because I never talked about DLSS2, but about DLSS framegen, which doesn't exist on the 3090, which forces me to use FSR frame gen.
Either you don't understand what frame gen is or you are just stupid.
Either you don't understand what frame gen is or you are just stupid .
Mind you i never even mentioned DLSS2 to begin with as it is of no importance.
TBoner101@reddit
And that's what I get for trying to help... Hopefully it helps someone else who isn't an ingrate.
So you're mad because I mentioned DLSS 2? Cause it's the one technology you already knew about? As if that would make you look any less stupid than you already do? Well then, my apologies. I underestimated your amount of knowledge (specifically, the lack thereof). Somehow, you're even dumber than I thought you were.
Can you read? If you actually tried to understand my post or the article linked, you'd realize why I brought it up. The reason is because DLSS 2 is integral to DLSS 3, you dunce.
Upscaling is a significant aspect of frame generation. While technically you CAN run FG w/o upscaling or instead of it, I don't know anyone who does or why they'd want to (outside of w/ DLAA, but only if it's supported, and most people seem to prefer DLDSR anyway, which can be used w/ DLSS + works everywhere). Not to mention the end result is about half the frames.
Congratulations. As someone who has been using reddit for > a dozen years, you're simultaneously the rudest and dumbest person I've ever communicated with. Durr durr durr
Awankartas@reddit
First of all i am not mad. Second of all i never talked about DLSS2. My post literally mentions frame generation because shitty nvidia didn't include it for older cards.
For some reason you and other guy talk about DLSS2.
Right now i am using FSR3 in C77. Thanks to AMD not Nvidia my FPS are fucking awesome.
TBoner101@reddit
It certainly sounds like it. It also sounds like you speak ESL.
Since you state you're using FSR3 in Cyberpunk, then unless it's via mods, you're using FSR 3 Frame Gen with AMD's upscaler. The posters above are trying to tell you that you can use Nvidia's version of upscaling with AMD's version of Frame Gen, together (which is BETTER quality than FSR 3 by itself, which sounds like you're currently using), but you're so damn stubborn, arrogant, and ignorant to realize this, so everyone gave up trying to explain it to you except for myself.
Awankartas@reddit
There is barely any difference between DLSS2 and FSR3 upscaling, mate.
And again, I never talked about DLSS2. It is something you guys brought up. I have news for you: I already AM using DLSS2 with FSR3 framegen in C77.
TBoner101@reddit
Congrats. Want a cookie?
Riight, that's why it took five different people trying to explain it and yet you still were talking about DLSS 3's frame gen in the comments above mine asking if "there is a mod that allows me to use DLSS3 framegen ?", and then after that still said "So you are saying i can't use DLSS3 frame gen after all ?".
Liatin11@reddit
Dlss 2 upscaling + fsr3 frame generation but nvm you probably won't get it working
ButtPlugForPM@reddit
This is why I think AMD just can't catch up.
AMD might, let's say, be able to go: here is the 9900XT, it's 1.4 times as powerful as an RTX 5090 at raster.
Cool.
But the average gamer is still going to see that NVIDIA is still leagues ahead on the software.
Frame gen/DLSS/Reflex and NVENC are all far ahead of the software suite AMD offers.
psydroid@reddit
I always bought ATI dGPUs back in the day but all my recent discrete GPUs in laptops and desktops are from Nvidia. I don't need a discrete GPU for gaming, which I don't really do anyway, but I do need good software support for developing and running AI and HPC software.
It's something AMD has only come to realise very recently. I think AMD's future mainly lies in iGPUs, as that is where the majority of their market share lies. Nvidia could have something similar with ARM cores combined with Nvidia GPU cores, so I don't even expect that advantage (for x86 compatibility reasons) to stick around for too long for AMD either.
XenonJFt@reddit
Ideas are running out. They have to make their cards stand out so the second-hand market doesn't cut into their sales. The competition will try to copy, but it was an "ehh" for a long time. DLSS 1 was shit. DLSS 2 is good. DLSS 3 was bad in that it didn't need new hardware or AI to be replicated properly. It's a slow downward spiral; we don't know how many generations until we hit the ludicrous point.
conquer69@reddit
And yet, AMD's framegen is worse precisely because it doesn't have the hardware acceleration that Nvidia uses.
From-UoM@reddit
Dlss 3 is so bad AMD is not doing Ai frame generation next.
OH WAIT. THEY ARE.
XenonJFt@reddit
That's DLSS 2, smart-ass. AMD made frame gen without the AI gizmo.
From-UoM@reddit
Uh no?
Dlss 3.7 upscaling is well ahead of dlss 2.0 upscaling
And no. Amd is doing AI frame gen as well.
https://www.tomshardware.com/pc-components/gpus/amd-plans-for-fsr4-to-be-fully-ai-based-designed-to-improve-quality-and-maximize-power-efficiency
Liatin11@reddit
Go onto the AMD sub; once they got FSR3, frame gen stopped being their boogeyman. It's crazy lmao
PainterRude1394@reddit
And once an AMD card can deliver a decent experience in path traced games suddenly it's not a gimmick and is totally the future.
chmilz@reddit
You mean once a feature is readily available and no longer elusive people will see it as a standard? Crazy if true.
conquer69@reddit
That is an irrational take. You are supposed to determine if something is good or bad whether you have it or not. Ray tracing is objectively better than rasterization even if I can't run it yet.
There is enough coverage about ray tracing on youtube to form an educated opinion about it. There is no excuse for being an irrational brand loyal contrarian.
PainterRude1394@reddit
No, I mean once an AMD card can deliver a decent experience, AMD fanatics will stop squealing about how ray tracing doesn't improve graphics.
skinlo@reddit
Pretty elusive on Nvidia cards given how few good full path traced games are out there.
chmilz@reddit
You're overly concerned about the like, 8 people who can be called AMD fanatics.
Path traced experiences are elusive period. The only thing more elusive than AMD fanatics is games that have path tracing.
PainterRude1394@reddit
No it's a pretty mainstream opinion here. I'm not sure why you're so defensive about this.
f1rstx@reddit
“RT is just a gimmick and DLSS and FG are bad fake frames.” Now they love FSR, love AFMF, and they will love RT when AMD finally makes a card that's capable of running it. Honestly, the AMD community is very cult like.
PainterRude1394@reddit
And honestly I wouldn't even be talking about it if it wasn't everywhere on the Internet. Even mentioning it brings them out to defend themselves.
Name213whatever@reddit
I own AMD and the reality is when you choose you know you just aren't getting RT or frame generation
LeotardoDeCrapio@reddit
Not just AMD. There are people who develop such an emotional connection to a company that they become offended by random feature sets in products, all over the internet.
You can see people willing to die on the most random hills in this sub, like Samsung vs TSMC semiconductor fabrication processes.
The ensuing word salads are hilarious.
ProfessionalPrincipa@reddit
What hill would that be?
Ranger207@reddit
A very, very small one
LeotardoDeCrapio@reddit
A silly one.
4514919@reddit
The AMD sub suddenly giving a shit about latency, when for years they had worse latency than Nvidia because there is no Reflex alternative from AMD, was peak hypocrisy.
OsSo_Lobox@reddit
I think that just kinda happens with market leaders, look at literally anything Apple does lol
zakats@reddit
Gotta say, I really don't give a good goddamn about RT.
ProfessionalPrincipa@reddit
I'm with you. I have about 40 games installed on my PC right now and only one of them supports any kind of RT and I'd estimate maybe only three others would make any good use of it if it supported it.
ProfessionalPrincipa@reddit
And the vast majority will not be able to run it without severe compromises because their video card only has 8GB of VRAM.
From-UoM@reddit
Maybe they will add something that compresses textures in VRAM through AI.
They did release a doc on random access neural texture compression
DehydratedButTired@reddit
AI makes money, graphics do not. Easier to prioritize AI components and then adapt their gaming offerings via software. It sucks, but we're no longer their main money maker; they can't get 600% profits out of us.
Enigm4@reddit
I promise you that the 5090 will set new records, and I am not talking about performance. If there is no competitor card that comes within 50% of the 5090 performance they can charge even more than they currently do.
Cheeze_It@reddit
Not everyone will buy it though. I haven't bought Nvidia hardware in probably over a decade now. Those features are just not worth it for me.
vanBraunscher@reddit
Also it will have a massive performance impact for a decidedly moderate uplift in fidelity. During the first few generations of the tech most people will have to squint long and hard to even see a distinct difference in comparison screenshots/videos.
But a very vocal subset of early adopters will flood the internet, tirelessly claiming that it is the most transformative piece of kit in the history of grafixx ever, and that the 400-buck markup for the ZTX 5060 is totally worth it (though you'll need a ZTX 5099.5++ to get more than 35fps consistently, which is of course completely fine as well).
I know, I know, it sounds very outlandish and implausible that people would ever act this way, but what would be a lil' haruspicy without a bit of spice /s?
grillelite@reddit
He who stands to benefit most from AI.
auradragon1@reddit
I mean, he's one of the main reasons we're in this AI situation in the first place. Give him credit.
ExtendedDeadline@reddit
Credit for the recent enshitification of most companies and their massive pivots to primarily useless AI gimmicks?
SillentStriker@reddit
What? One of the few legitimate businesses in AI is at fault for the rest of the snake oil?
ExtendedDeadline@reddit
I would mostly attribute the current hype and garbage to open AI and Nvidia, yeah. Nvidia makes some great hardware.. and they'll hype whatever market is buying it.
auradragon1@reddit
It's times like this when I will dig into the post history of someone who claims OpenAI is "garbage" and find something that ChatGPT could have helped them with.
https://www.reddit.com/r/Cooking/comments/mm0uve/does_hard_water_effect_nonstick_properties/
A few years ago, @ExtendedDeadline asked that question.
This is GPT4o's answer. Honestly, the answer is far better than what the commenters on Reddit gave. Furthermore, @ExtendedDeadline wouldn't have had to wait for the answer.
JQuilty@reddit
How do you know ChatGPT isn't hallucinating?
CJKay93@reddit
Because it's correct lol.
Gaycob@reddit
Best to only ask questions you know the answer to then
CJKay93@reddit
Or you could just go and verify it..? How do you know whether to trust something on Wikipedia?
JQuilty@reddit
Then what value does ChatGPT provide if I have to continue to verify?
CJKay93@reddit
Great question. Let's ask it!
You can consider this summary human-verified.
Dog_On_A_Dog@reddit
This has to be a troll comment
CJKay93@reddit
Sorry, which point(s) do you actually take issue with? Because whether they were AI or human-generated doesn't really contribute to their relevance.
Strazdas1@reddit
All of them. None of them are true if the answer is a hallucination. LLMs as we have them now cannot provide overviews for example. They can make a best guess at what an overview would be without actually understanding the context.
CJKay93@reddit
All LLM answers are hallucinations, that's literally how LLMs work - this does not really address my question. What does it mean to "understand"? Why does it need to "understand" anything? Why is prediction insufficient? Does a calculator "understand" the equations it's given? How do you know it's right?
Strazdas1@reddit
Because without understanding, it cannot give an answer, it can only guess. Yes, the calculator understands the equation. It does not guess an answer, it makes a mathematical calculation.
The floating point issue has been known, and scientific models use specific software to circumvent it.
CJKay93@reddit
It's not "guessing" anything, it's using correlations in its source data to predict output. You can generate the same output over and over and over using the same seed - it's not pulling things out of thin air.
Great, so if I run a linear regression on a calculator, is it suddenly "guessing"?
This has nothing to do with scientific models or software, it was a bug in the Pentium FPU that caused incorrect calculations for some floating point operations.
Kio5hi@reddit
Lets ask chatgpt if that's a troll comment
APR824@reddit
They can’t even be arsed to defend their own opinions, they delegate it to ai
auradragon1@reddit
Maybe you can verify for us if GPT4o's answer is correct.
JQuilty@reddit
You expect me to do that every time while you're pitching it as a general solution?
auradragon1@reddit
How do you know results from search engines are correct?
JQuilty@reddit
I don't, which is why I have to verify by the source being credible. LLMs are a black box that don't tell me where they're getting information from, yet you're promoting them as a cure all when they have the same problems in a worse way.
auradragon1@reddit
I noticed that you had an unanswered question on ZFS. https://www.reddit.com/r/homelab/comments/1dq5djr/zfs_issue_question/
Here's what GPT4o had to say:
JQuilty@reddit
Cool, boilerplate shit I already tried, is in the man pages, and can see on stackoverflow. You're not doing a good job selling LLMs as some magical better thing than regular search engines.
auradragon1@reddit
I tried it with another problem of yours: https://www.reddit.com/r/homelab/comments/qbmj1w/issue_with_nextcloud_via_docker_and_nginx_reverse/
Not only did it give the solution, it was much clearer than the comment solution you received on Reddit.
The issue you're experiencing with the Nextcloud desktop client is related to how Nextcloud generates URLs when it's behind a reverse proxy like Nginx, especially when SSL termination is involved. Even though everything works fine in the browser and on Android devices, the desktop client performs stricter checks and detects that the polling URL is not using HTTPS, which raises a security concern.
Root cause: When Nextcloud is behind a reverse proxy that handles SSL encryption, it needs to be aware of this setup to generate correct URLs using HTTPS. If not configured properly, Nextcloud may generate HTTP URLs internally, leading to the error you're seeing in the desktop client.
Solution: You need to update your config.php to inform Nextcloud about the proxy and the SSL termination. Specifically, you should:
Define trusted proxies: include the IP address of your reverse proxy (Nginx) in the trusted_proxies array. Since Nginx is running on the host machine, you should use the Docker network's gateway IP or the host's LAN IP.
Set the overwrite protocol: use the 'overwriteprotocol' => 'https' directive to tell Nextcloud to use HTTPS for generating URLs.
Set the overwrite CLI URL: update 'overwrite.cli.url' to reflect the external URL clients use to access Nextcloud.
Explanation of changes: 'trusted_proxies' gets both 192.168.1.7 (your host's LAN IP) and 127.0.0.1 so that Nextcloud trusts requests forwarded by Nginx; 'overwriteprotocol' is set to 'https' to force Nextcloud to generate URLs with HTTPS; 'overwrite.cli.url' is changed to 'https://nextcloud.domain.com' to reflect the external URL with HTTPS; 'overwritehost' is optionally set to your domain to ensure consistency in generated URLs.
Additional steps: after making these changes, restart your Nextcloud container and Nginx so the new configuration is loaded, and verify that Nginx is correctly set up to proxy requests to Nextcloud and is passing the necessary headers like X-Forwarded-For and X-Forwarded-Proto.
(The answer also included an updated config.php and an example Nginx configuration snippet.)
Why this works: by informing Nextcloud about the proxy and enforcing HTTPS in its internal URL generation, you align the internal URLs with the external ones accessed by clients. This prevents the desktop client from detecting mismatches between the login URL and the polling URL, resolving the security warning.
Note on updates: your Nextcloud version (22.1.1) is quite outdated. Consider updating to a newer version to benefit from security patches and improvements. Always back up your data before performing an upgrade.
Conclusion: implementing these configuration changes should resolve the error you're encountering with the Nextcloud desktop client. The key is ensuring that Nextcloud is fully aware of the reverse proxy setup and handles URL generation accordingly.
JQuilty@reddit
How the fuck is that cleaner than being told my eyes are bleeding and I had to uncomment a single line I commented while troubleshooting? Verboseness isn't clarity.
auradragon1@reddit
Because it actually explains the problem.
The model assumed that you commented that part out on purpose.
JQuilty@reddit
Cool man, I know what all of those do, I was just sloppy with comment lines setting it up.
auradragon1@reddit
You have to be an absolute moron to not see the value now. Just denial at this point.
JQuilty@reddit
I have two bachelor's degrees, in English literature and Computer Science. Believe me, verboseness is not a plus. Brevity is the soul of wit.
auradragon1@reddit
You can just tell chatgpt to be succinct. lol.
auradragon1@reddit
So what was the solution for your problem?
JQuilty@reddit
I thought ChatGPT knew all?
auradragon1@reddit
No one claimed that.
JQuilty@reddit
Nope.
auradragon1@reddit
So what was the solution?
JQuilty@reddit
Why don't you ask ChatGPT?
auradragon1@reddit
Here's OpenAI's new chain-of-thought model:
It sounds like you're experiencing issues with mounting your ZFS pool 'Media' after a reboot, even though the pool appears healthy according to zpool status. Here's a step-by-step guide to help diagnose and potentially resolve the issue:
1. Check the pool status: run "zpool status Media" and ensure all devices are online and there's no reported corruption.
2. Run a scrub: "zpool scrub Media" will check for data integrity issues; monitor the progress with "zpool status".
3. Check for stuck mounts or processes: "zfs unmount -a", then use lsof or fuser to find processes using /mnt/media ("lsof +D /mnt/media") and kill any processes that may be hanging.
4. Inspect the mount point directory: check for residual data with "ls -la /mnt/media". If there are files or directories present when the pool isn't mounted, move them ("mkdir /root/temp_media_backup; mv /mnt/media/* /root/temp_media_backup/").
5. Attempt to mount manually: "zfs mount -a". If it hangs, interrupt the process and try mounting a specific dataset to isolate the issue.
6. Change the mount point temporarily: "mkdir /mnt/test_media", "zfs set mountpoint=/mnt/test_media Media", "zfs mount Media", then see if you can access the data at /mnt/test_media.
7. Check system logs for errors: review "journalctl -xe" and "dmesg | grep -i zfs" for any error messages related to ZFS or mounting issues.
8. Inspect ZFS dataset properties: list all datasets and properties with "zfs list -o name,mountpoint,canmount" and "zfs get all Media". Ensure that the canmount property isn't set to off and that the mountpoint is correct.
9. Check for replication issues: "zfs list -t snapshot" and look for any incomplete or corrupted snapshots. If there's a replication process that's hung, you might need to cancel it or clear any locks.
10. Update ZFS packages: ensure your ZFS packages are up to date, e.g. "dnf update zfs" (replace dnf with your package manager if different). Outdated modules can cause unexpected behavior.
11. Export and re-import the pool: "zpool export Media", then "zpool import -f -o readonly=on -o cachefile=none Media".
12. Check disk health: install smartmontools if not already installed and run "smartctl -a /dev/sdX" for each of your drives, looking for any reported errors or failures.
13. Inspect ZFS transaction groups: "zpool events -v Media" and look for any recent errors or warnings.
14. Try mounting in single-user mode.
15. Check for file system corruption: "zdb -e Media". Be cautious with zdb as it can be complex; look for any corruption reports.
16. Back up critical data.
17. Consult logs for kernel messages: "dmesg | less" and search for ZFS-related errors.
18. Consider ZFS recovery tools: zfs_recover might help, but such tools can be risky and should be used as a last resort.
19. Seek professional assistance.
Summary: the issue may stem from residual data in the mount point, a stuck replication task, or underlying file system corruption. By systematically checking each potential cause, you can isolate and address the specific problem preventing your ZFS pool from mounting correctly.
JQuilty@reddit
Thank you for demonstrating LLM's just regurgitate.
auradragon1@reddit
I mean, that's just not true. I asked GPT4o to provide a source for the answer. Here you go:
ExtendedDeadline@reddit
1) chatgpt is a good and real product. I am not disputing whether AI is real, I am asking what the ROI is and how often would a user rather pay for that benefit than just do two seconds of legwork?
2) chatgpt probably was literally trained off threads like the one you just linked to get me the answer.
I am happy you're an appreciator of my post history, though! 3 years ago is a good dig up.
auradragon1@reddit
Just today, I asked it to create a Postgres SQL schema for me for a product I'm building. I just gave it an example of a competitor for this product. The schema made sense and came up with stuff I didn't even think about. It took the model 20 seconds to generate, for a task that would have taken me over a day to come up with.
truthputer@reddit
When you are fired, no longer employable - and replaced with a small script that calls ChatGPT - will you still think ChatGPT is great?
I’m sure your old boss will like it. Until it replaces his job also.
And no you don’t get universal income, you’ll probably just waste it on food and rent.
auradragon1@reddit
Yes. It’ll be great. I own a ton of AI stocks.
ExtendedDeadline@reddit
I am happy for you and the others who can leverage it enough to give them money for the privilege. I don't really see the ROI for normal people compared to the input costs. Energy and hardware need to both become much cheaper for this technology to be highly leveraged by common folk, imo.
RabidHexley@reddit
This is the case for any rapidly developing new tech with lots of potential applications. But AI/ML does have the credentials to back itself up as a highly useful avenue for technological development. It's like saying the internet is a shit idea because most websites were 100% shit when the WWW was on the come-up.
mercm8@reddit
"Snake oil exists, therefore I hate medicine"
Strazdas1@reddit
Medicine exists, therefore I'll call snake oil medicine.
PainterRude1394@reddit
You probably have no clue how commonly Nvidia's tech is used lol. Chat bots are just the tip of the iceberg. Tons of manufacturing and design systems are built using Nvidia GPUs and software. Just because you dislike chatbots doesn't mean Nvidia hasn't been relentlessly driving innovation.
ExtendedDeadline@reddit
Only been posting here for 7 years and been into computer hardware for 20 years.. and see/use Nvidia in my company's products.. but ya, I'm sure I don't have a concept of Nvidia's use cases.
Reality is they are primarily valued as they are now because of AI, not because of their other use cases. They went from a <1trillion company to about a 3 trillion company in valuation only because of the chatgpt/AI surge.
Let AI ROI start to come to play and we'll see a re-evaluation of their proper valuation.
Intel and AMD are in almost everything too, as is Qualcomm. None of them are so richly valued as Nvidia and it's primarily because of that AI delta.
PainterRude1394@reddit
I'm clarifying Nvidia has done tons beyond driving chat bots.
ExtendedDeadline@reddit
Because the only reason Nvidia commands so much general attention as of late is because they are an almost 3T company, primarily on the rails of wherever AI goes.
On this sub, before AI, they were mostly discussed in the context of the gaming GPUs, applications towards BTC, some inference, and their acquisition/tech that came out of the Mellanox pickup.
psydroid@reddit
You should watch some of the GTC videos. Then you will realise that AMD doesn't have anything that comes close. Intel has been trying but hasn't been very successful mainly due to the lack of performance of their hardware, but otherwise OpenVINO has been more promising than anything AMD has come up with.
I read that AMD bought an AI company recently, so they may finally start taking things seriously and get their software stack in a usable state for developers and users alike.
red286@reddit
You're over-focused on chatbots.
AI is far more than chatbots. Those are just the most common consumer-facing application, because your average person isn't going to give a shit about things like protein folding.
We likely won't see massive benefits from AI for another ~10 years, but they will be coming and they will likely revolutionize a lot of industries.
kopasz7@reddit
If you want to talk credit, then it should go to the authors of the "Attention Is All You Need" research paper.
auradragon1@reddit
There are a lot of people who deserve credit. I said he's one of the main ones.
punoH_09@reddit
When DLSS is implemented well it doubles as anti-aliasing and free FPS with no loss in quality. Much better anti-aliasing than the blurry TAA-style smoothing too. Poor implementations are unusable. idk how they're gonna make sure it works well.
Enigm4@reddit
There are always visual bugs with up-scaling. It just varies how noticeable it is.
basseng@reddit
There have always been visual issues with any anti-aliasing method (outside of straight up rendering at a higher res and downscaling - aka supersampling).
MSAA for example (which many still gush over as the best AA solution) only worked on object (polygon) edges, so it did sweet FA for shaders or textures (which was painfully obvious on transparent textures like fences).
DLSS, or more specifically here DLAA, is IMO the best AA method currently available (or that has ever been available) - so much so that if I could turn it on in older titles, even ones that I could run at 4K at 120fps+, I still would.
It is IMO just plain better than supersampling.
ibeerianhamhock@reddit
This is an excellent point. There's literally never been an AA method better, and none have actually *created* performance instead of costing it.
Gawd, I remember a decade ago when we were all using FXAA because SSAA and MSAA were so expensive; it just looked slightly less shit than native and didn't involve TAA, which to my eyes is the worst effect to mitigate. DLSS is miles better than anything we've ever had before.
Strazdas1@reddit
DLAA costs performance instead of creating it.
ibeerianhamhock@reddit
Yeah I misread, I thought the comment above was broadly talking about DLSS but also mentioning DLAA. You are correct...they were saying DLAA is the best AA method ever, I read their comment too quickly.
The cool thing is that it's not a huge cost - similar to FXAA levels of performance hit, maybe a little higher. Crazy that we can now take a shitty-AA-sized performance hit and get the best AA solution ever.
Strazdas1@reddit
Fences should be using multiple objects instead of transparent textures. Otherwise you get incorrect hitboxes.
jabberwockxeno@reddit
Except I don't like using Anti-aliasing at all
I prefer the jaggies and the better performance.
III-V@reddit
I do too. I just like the sharpness of the edges.
Enigm4@reddit
Yeah I have never been a fan of any anti-aliasing except super sampling. 2x usually works very well on 2-4k resolutions.
Rodot@reddit
Which is kind of the benefit of deep-learning super sampling. It doesn't have to be perfect, it just needs to appear perfect.
ibeerianhamhock@reddit
Yep. In modern implementations, the DLSS-rendered image looks fantastic. You can only point out irregularities when comparing to the reference truth... of an artificially created virtual world lol. It's gatekeeping imo that people take issue with this; everyone will be doing it very, very soon.
Enigm4@reddit
This is just not true at all. You can easily make out errors in finer details. Especially small details that are in motion. Ghosting is easily visible.
ibeerianhamhock@reddit
It's tradeoffs. Some things look better than native, some things look worse, but the FPS you get in return makes the balanced tradeoff seem better overall imo.
StickiStickman@reddit
DLSS has worked well for me in every game I tried it, doesn't seem to be that much of an issue.
BausTidus@reddit
There are lots of games where DLSS just completely destroys picture quality.
ProfessionalPrincipa@reddit
It's funny seeing polar opposite posts both being positively upvoted.
lI_Jozu_II@reddit
They’re both correct in a subjective sense.
“DLSS works well in every game,” says the guy on 4K who appreciates the performance boost and preservation of fine detail.
“DLSS completely destroys picture quality,” says the guy on 1440p who dislikes ghosting, motion blur, and jitter.
DLSS will always have caveats. It just depends on whether or not you’re personally bothered by them.
Tuarceata@reddit
Source/target resolutions aside, dev implementation makes a significant per-game difference in quality.
Deep Rock Galactic is an example where all upscalers artifacted like wild when they were initially added. They look fine now but anyone would be forgiven for thinking they absolutely destroyed image fidelity if that was their only example.
Strazdas1@reddit
If motion vectors are wrong, you get a lot of artifacts. If motion vectors are missing, you get a lot of ghosting. This is all up to the game dev to add.
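To make that concrete, here's a minimal numpy sketch of the idea; the tiny "screen", the object, and its velocity are made up purely for illustration. A motion vector tells a temporal upscaler where each pixel's surface was on screen last frame, so history can be reprojected from the right spot; with the vector missing, history is fetched from the wrong place and you get a ghost trail.

```python
import numpy as np

# Illustrative only: toy 4x8 "screen" with one bright object.
H, W = 4, 8
prev_frame = np.zeros((H, W), dtype=np.float32)  # last frame (grayscale)
prev_frame[1, 2] = 1.0                           # object was at row 1, col 2

curr_pos = np.array([1, 5])                      # object this frame: moved 3 px right
motion_vector = np.array([0, 3])                 # engine-provided: curr_pos - prev_pos

# Correct reprojection: fetch history at curr_pos - motion_vector.
src = curr_pos - motion_vector
print("with motion vector:   ", prev_frame[src[0], src[1]])            # 1.0 -> clean history

# Missing motion vector (treated as zero): history is fetched from the object's
# new position, where it wasn't last frame -> stale data gets blended in = ghosting.
print("without motion vector:", prev_frame[curr_pos[0], curr_pos[1]])  # 0.0 -> ghost trail
```

Wrong vectors are arguably worse than missing ones: the upscaler confidently pulls history from an unrelated surface, which shows up as smearing and warping artifacts.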
Arenyr@reddit
Overwatch 2 has terrible smearing when seeing teammates or enemies through walls.
Jags_95@reddit
They are still using DLSS 3.5, and any 3.7 DLL file you drop in gets overridden the next time you launch the game, so the smearing remains.
XHellAngelX@reddit
Wukong also.
According to TPU:
The DLSS Super Resolution implementation at 1080p and 1440p has noticeable shimmering on vegetation and especially tree leaves, and unfortunately it is visible even when standing still.
Surprisingly, the FSR 3 implementation has the most stable image in terms of shimmering in moving vegetation.
Accuaro@reddit
Guardians of the galaxy is trash with DLSS
Old_Money_33@reddit
Horizon Forbidden West is garbage with DLSS
XenonJFt@reddit
Uncharted 4 and War Thunder are completely broken with DLSS, checked last month.
llevifn@reddit
Since AMD is leaving the GPU race, this seems like an excuse to stop innovating.
Zexy-Mastermind@reddit
Why are you getting downvoted ?
Mr_ScissorsXIX@reddit
AMD is not leaving the GPU race.
I-wanna-fuck-SCP1471@reddit
Well they are where it matters to Nvidia, which is in the high-end.
Kryohi@reddit
Consumer cards above $600 are an ultra tiny portion of the market...
Strazdas1@reddit
Not according to steam survey.
AsterCharge@reddit
BREAKING NEWS! AMD is behind Nvidia on top-shelf graphics hardware. Just like every gen of GPUs for the past 15 years.
I-wanna-fuck-SCP1471@reddit
My bad for clarifying, next time i'll just ignore facts.
martialartsaudiobook@reddit
And to increase prices even more of course.
temptingflame@reddit
Seriously, I want shorter games with worse graphics made by people paid more to work less.
PapaJaves@reddit
This sentiment is equivalent to car enthusiasts begging companies to make manual transmission cars and then when they do, no one buys them.
Strazdas1@reddit
Uh, you do realize that outside the US, the vast majority of cars sold are manual, yes?
kikimaru024@reddit
It's not like there's a massive indie scene of smaller-scale games or anything that you could play.
Strazdas1@reddit
I think he meant he wants the games to actually be good though?
ExtendedDeadline@reddit
Give me more 2D and low-pixel 3D dungeon crawlers. I am actually over super-high-fidelity games with relatively mediocre stories.
DehydratedButTired@reddit
You just described the indie market. Not having to pay for marketing or C-level management really keeps the cost of a game down and the quality up.
LandscapeMaximum5214@reddit
Sad how AI is only there to replace their employees just to make these billionaires more money
LeMAD@reddit
Personally I want better games than the garbage we're getting these days. And games made for adults instead of 13yo boys.
Zarmazarma@reddit
Here are some great games that have come out this year:
This is by no means an exhaustive list, but it covers games from a fair number of different genres, as well as indie and AAA games. It's also not even close to all of the good games that have come out recently, or that are available for you to play. I've been enjoying Fear and Hunger and Blasphemous recently.
Adonwen@reddit
Name them.
billistenderchicken@reddit
I’ve been hearing this same sentence since I was a kid. There are a lot of amazing games out there; stop playing trash games.
InconspicuousRadish@reddit
What do you mean by games made for adults? One can enjoy Disco Elysium and Apex Legends at the same time.
There are tons of excellent games out there. The market is as competitive as it has ever been. 2023 has had some of the best games in recent memory. If you can't find quality games, it's really on you.
Unless you're buying obviously soulless microtransaction crap like Suicide Squad, this really isn't an issue.
OkStrategy685@reddit
yep. not even worried about gpu's or ai or gaming as gaming sucks now.
mackerelscalemask@reddit
Luckily for you, Switch 2.0 is due out in under six months time!
trmetroidmaniac@reddit
A finger on the monkey's paw curls. Graphics get worse, but you still need more powerful hardware to run them.
koolaidismything@reddit
You’re in luck, cause that’s a best case scenario moving forward lol.
NeroClaudius199907@reddit
I will admit DLSS is better than a lot of native AA now, but I wish we had better AA for 1080p. Yes, I know about deferred rendering.
f3n2x@reddit
Why? DLSS at higher resolutions absolutely trounces native 1080p in quality no matter how much AA you apply. DLSS-P at 4k (which is 1080p internally and only slightly slower than native 1080p) is so much better than native 1080p it's almost unreal.
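For reference, a tiny sketch of the resolution math being described; the per-axis scale factors below are the commonly cited values for the DLSS presets, so treat them as approximate rather than an official spec.

```python
# Per-axis render-scale factors as commonly documented for DLSS presets (approximate).
PRESETS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, preset):
    """Resolution the GPU actually renders before DLSS upscales to out_w x out_h."""
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

# 4K output with DLSS Performance renders internally at 1920x1080, which is why
# its cost is in the same ballpark as native 1080p (plus the upscaling pass itself).
print(internal_resolution(3840, 2160, "Performance"))   # (1920, 1080)
print(internal_resolution(2560, 1440, "Quality"))       # (1707, 960)
```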
Munchbit@reddit
Because the majority of users still run 1080p monitors as their main monitor. I’ve noticed games nowadays either look jaggier or blurrier (or both!) at 1080p compared to a decade ago.
f3n2x@reddit
Modern games are designed for higher resolutions (much more sophisticated lighting which also has to scale well up to 4K, so there have to be some trade-offs); that's why they can look blurrier at lower resolutions. They're certainly not jaggier. AA is as smooth as it's ever been.
Munchbit@reddit
Unfortunately, in some games, too smooth. You either get a jaggy mess without AA or a blurry mess with AA due to the nature of temporal anti-aliasing. You’ll almost always need to resort to sharpening filters to mitigate the blur.
f3n2x@reddit
Which, as I said, only really applies to lower resolutions. I understand that many people still have 1080p monitors, but they simply aren't well suited for modern games, especially games designed primarily for consoles.
Munchbit@reddit
The Steam survey found that 56% of users are still using 1080p monitors, and they need good AA the most. I bet most of that comes from gaming laptop users. Not arguing against your point; I’m just highlighting the need for good AA techniques for the majority of users.
Strazdas1@reddit
The Steam survey finds that I have 1080p, 1440p and 4K monitors, even though I only game on one of them.
slither378962@reddit
/r/fucktaa
Aggravating-Dot132@reddit
Like what? Jedi Survivor? Cyberpunk? Those games, where TAA is extremely bad?
f3n2x@reddit
Like virtually - if not literally - any game with DLSS 2+?
From-UoM@reddit
DLAA?
NeroClaudius199907@reddit
DLAA is good but rarely available... SMAA T2x is nice... sharp and clear... The jaggies are there, but I'll make that sacrifice. I'll take it.
ShowBoobsPls@reddit
DLSS Tweaker lets you use DLAA in every DLSS game
From-UoM@reddit
Try DLDSR if you have GPU headroom.
Aggravating-Dot132@reddit
It makes less noise in terms of shimmering, but for fuck's sake, the flickering on some lights is just so fucking annoying.
I wish we could have a hybrid of some kind of Deep learning stuff for lines (like cells, grass and so on), but everything else being SMAA.
Zoratsu@reddit
DLSS + DLDSR?
DLSS up to 4K, then DLDSR back down to 1080p.
yUQHdn7DNWr9@reddit
I guess we will need an aiming reticle that skips over the inferred pixels, because shooting at hallucinations doesn’t sound rewarding.
azn_dude1@reddit
So what do you think the difference between upscaling and hallucinations is? Or even anti-aliasing vs hallucinating? Computer graphics is all about getting the most pixels for the least amount of work. The idea here is sound; it all just depends on the execution.
Strazdas1@reddit
Shader deformation and tessellation are hallucinations by the game engine.
yUQHdn7DNWr9@reddit
In the specific case of computer graphics for games, the highest possible fidelity to the game state is as important as the highest number of pixels.
azn_dude1@reddit
That's the case with any kind of feature that generates pixels without fully calculating them, but I don't see you brushing any of the existing ones off as worthless. Just AI bad I guess
yUQHdn7DNWr9@reddit
Haha no, it’s the 1:32 ratio that appears unsustainable.
trmetroidmaniac@reddit
At this point Nvidia is an AI company with a side gig in graphics cards. I hope that this is all over before too long.
Thorusss@reddit
so you hope for the tech singularity? ;)
Rodot@reddit
The tech singularity started millennia ago when the first proto-hominid created the first tool that could be used to create another tool. It's all been an exponential technological runaway since then.
Strazdas1@reddit
It started slow but it's accelerating. The train may look like it's moving slowly at first, but by the time it's flying by the place you are standing, it's too late for you to hop on.
LeMAD@reddit
Maybe saying AI 30 times during earnings calls is soon to be over, but AI itself isn't going anywhere.
xeroze1@reddit
The bubble will burst. All the management are so deep into the groupthink that they won't take any sort of pushback. Like, there is merit in AI, but damn, some of the business use cases pushed by management make fucking zero sense from a cost or revenue perspective.
I work in a devops role on a data science/AI team, and recently, when talking to the data science folks at the water coolers etc., the common trend is that even they are kinda sick of all the AI stuff, especially since we have set up an internal framework that basically reduced a lot of the work to just calling services like GPT/Claude etc., so it just feels like a lot of repetitive grunt work in implementation after that.
For the business side, we know that there are some benefits, but the problem is that the best use cases for AI are all improvements to existing services rather than replacements of humans, so it turns out there isn't much of a cost benefit, while the returns are hard to quantify.
Just waiting for the burst to come and bracing myself for the fallout tbh.
gunfell@reddit
The financial benefits from AI have been measured and seem to be pretty substantial. There might be a bubble pop in Nvidia's stock price, but outside of that, AI will be printing money for decades.
The use cases expand as hardware improves. We have not even been through one GPU upgrade cycle in AI hardware since ChatGPT became public.
Mercedes expects to have level 4 autonomy possibly before 2030.
LAUAR@reddit
Source?
gunfell@reddit
https://www.bing.com/fd/ls/GLinkPing.aspx?IG=1FC0212808884722B8CF80CDC4C3D252&&ID=SERP,5212.2&SUIH=7ikZWseCNSrqAQdnn7H-JQ&redir=aHR0cHM6Ly93d3cuYmxvb21iZXJnLmNvbS9uZXdzL2FydGljbGVzLzIwMjQtMDItMDgvYWktaXMtZHJpdmluZy1tb3JlLWxheW9mZnMtdGhhbi1jb21wYW5pZXMtd2FudC10by1hZG1pdA
It is a Bloomberg article on how AI is driving layoffs through efficiency gains. There are other ones too.
Exist50@reddit
I'd be rather suspicious about how data-driven that decision is, vs a more investor-friendly spin on already intended layoffs. And I'm optimistic about AI's ability to replace humans.
gunfell@reddit
That is sorta reasonable. I think in certain things we know AI is AT LEAST making some people more efficient. But obviously AI is still a neonate. I think in 6 years (when we have the RTX 7000 series out, plus time for models to be worked on), the tech companies that did not lean into AI will be regretting it a little. And every year the regret will grow a little.
Thingreenveil313@reddit
Link doesn't work for me. Just takes me to Bing's home page.
JL3Eleven@reddit
I'll be taking my Uber helidrone, peasant.
gunfell@reddit
Do what works for you kobe bryant
College_Prestige@reddit
The dotcom bubble didn't cause the Internet to fade into obscurity.
xeroze1@reddit
It didn't, but a bunch of people lost their jobs, and the direction of the internet went drastically different from what people were hyping it up to be.
Whatever AI turns out to be will not be what people are hyping it up for right now. A lot of the useful cases we have will require years if not decades before they get to a usable state. Those are not where most of the money is going. There is a lot of bullshit AI stuff that is just there to grab funding, to show that they are "doing/using AI" (whatever that is supposed to mean), instead of building the fundamentals, the data and software infrastructure, to be able to adapt quickly and utilize the newer generations and forms of AI that will inevitably function very differently from the generative AIs of today.
Companies whose data infrastructure is so bad that they are still running on data with quality issues, running 20-30 year old outdated systems while trying to use AI in whatever business use case without understanding it - that is what is so often seen these days. Those are the folks who will crash and burn, and it will be the poor folks working on the ground who will suffer for it.
auradragon1@reddit
Example?
I think the internet is way bigger than even they imagined it back in 1999.
Who knew that the internet would eventually give rise to LLM-based AIs?
currentscurrents@reddit
But they were right. Ecommerce is now a $6.3 trillion industry. The companies that survived the crash (like Amazon and Ebay) are now household names.
Generative AI needs more research effort to mature and faster computers to run on. But I'm pretty sure it's here to stay too.
CaptainDouchington@reddit
Because those managers probably have investments in Nvidia outside their job, so they use their job to bolster their portfolio.
I don't get why people can't grasp that's all this is. Everyone that's purchased AI chips from NVIDIA was heavily into them during the bitcoin boom. Can't let that sweet, sweet portfolio value drop.
conquer69@reddit
It's the same shit they are doing with real estate. Forcing employees into useless offices because the parent company owns the entire block of skyscrapers.
Competitive-Door-321@reddit
I think it'll be similar to, but much smaller than, the dot-com crash in the late 90's. Obviously that didn't lead to the internet going away; it was mainly just a consequence of the rampant overinvesting that had been happening.
Same thing is happening with AI. Tons of VCs are dumping massive money into AI projects with little prospect of success, like the Humane AI Pin and the Rabbit R1. A lot of that money is never going to see a return on investment.
But AI is here to stay. NVIDIA is right that it'll persist and actually increase in prevalence and importance, just like the internet did. It'll probably follow a pretty similar trajectory, just a little quicker.
College_Prestige@reddit
I suspect all the hardware efforts by companies will eventually start being shut down or consolidated with other companies' efforts. At some point they have to realize it's not really worth it to compete against Nvidia at such a small scale.
Exist50@reddit
I agree. There's no market for a dozen different ASICs. I think the end state would look like 3-ish vendors of highly flexible merchant silicon (Nvidia, AMD, [Intel?]), and the CSPs each having in-house teams. Sort of like the CPU situation today.
Caffdy@reddit
Amazon, Google, Apple and a bunch of other companies have been researching AI hardware for quite some time; if there's gonna be a consolidation in the future, we probably won't be alive to see it.
Exist50@reddit
For their own uses, yes. But what about the dozen-odd currently independent startups? Who will they have to sell to?
i_love_massive_dogs@reddit
It's not like Intel ate the whole processor market in the 2000s. On the contrary, there is a larger variety of processors being manufactured and designed than ever before.
GPUs are monstrously energy hungry AI accelerators, meaning they aren't suitable for every AI related task. There are plenty of diverse use cases for AI that benefit from more specialized accelerators.
Competitive-Door-321@reddit
Now that's a hot take if I've ever seen one.
RevolutionaryDrive5@reddit
Similar to you, but I'm personally waiting for the internet bubble to burst. I'm told it's going to happen soon, or at least it was supposed to for the last 20 years... With that said, I can't wait to go back to the old days of sending messages via smoke signals or pigeons.
auradragon1@reddit
Software engineer here. I don't code without Claude Sonnet 3.5 anymore. It's not that I can't. It's that it makes my life 100x easier when I need it.
LLMs are getting better and cheaper every single day. They aren't going anywhere.
In my opinion, it's underhyped. I experiment with AI tools early. I'm an early adopter. Some of the stuff that I've used recently gave me "holy shit" moments.
gartenriese@reddit
This reads like some kind of advertisement.
Little-Order-3142@reddit
It's my experience as well. It's just $20/month, so it more than pays for itself.
auradragon1@reddit
If it helps, I also subscribe to ChatGPT Plus for $20/month. Also, 1 other LLM service for another $20/month.
Krendrian@reddit
If you don't mind, what exactly are you getting out of these? Just give me an example.
I have a hard time imagining any of these tools helping with my work, where writing code is like 5-10% of the job.
HephaestoSun@reddit
Not the guy you asked, but it's great for creating SQL queries, small methods, and object creation from a rough idea. It's kinda good at the boring stuff.
DiggingNoMore@reddit
Another software developer here. I don't even know what Claude Sonnet 3.5 is.
When ChatGPT came around, it sounded cool, so I thought I'd check it out. The website wanted my phone number in order to create an account. So I closed my browser.
That's the extent of my knowledge about AI. It sounds interesting, but I don't give away my personal information and I don't pay for subscriptions.
PostsDifferentThings@reddit
I also use an LLM to write code for me at work and it's amazing.
Unfortunately, that use case in no way, shape, or form is a reasonable business case for onboarding AI in a large enterprise. There needs to be more to it than helping us techies write code.
If that's all AI is, it's not the billion-dollar industry we thought it was. It's still a huge business, don't get me wrong. But a code assistant is not the same thing as a consumer-facing market segment.
kung-fu_hippy@reddit
The AI bubble will burst like the dot com bubble burst. A bunch of businesses will go out of business, but the core concept is likely here to stay.
xeroze1@reddit
That I agree with. A lot of stuff will change for good. The important thing is to make sure to survive the burst. I suspect those in pure tech companies and some hardware companies will take the hit, but industries which use AI prudently, in areas where it is actually helpful, will survive and have a second wind once the bubble bursts and we get all the BS marketing/unrealistic expectations out of the way.
ExtendedDeadline@reddit
It's not going anywhere, but it's mostly a gimmick that consumers don't want to pay for. Companies are spending billions in capex that doesn't show a clear ROI for "AI services". Eventually, the hardware purchased needs to make money.
AI is real, but it ain't profitable unless you're selling the shovels.
DehydratedButTired@reddit
The bubble where companies will spend $25k on a $4k part will not last forever. Nvidia is capitalizing on having no competition and a limited supply of silicon.
Banana-phone15@reddit
It already is. Nvidia's biggest source of revenue is AI; second is gaming GPUs.
Bitlovin@reddit
AI as a fad buzzword won't last long because people will get sick of hearing it (I know everyone here already is), but we're just at the tip of the iceberg in terms of corps buying up massive amounts of chips for AI purposes.
In terms of what it means for the GPU market, it's probably permanent. Yes, there are a lot of dumb promises being made about AI right now, but that doesn't mean there aren't profitable uses for these chips, and that's going to have a knock-on effect on GPU prices, because the fact of the matter is that the AI side of the business is permanently the money maker now. The gaming GPU hobby market is going to be insignificant, and less and less attention is going to be paid to it, sadly.
Successful_Winner838@reddit
Lol. You're all insane if you think AI is going to do anything but continue to grow.
Enigm4@reddit
That side gig sadly crushes the competition.
PainterRude1394@reddit
Tbf the best gaming graphics improvements have been from Nvidia pushing the boundaries here. I think this is much better than just releasing a slightly faster GPU for more money due to rising transistor costs.
Admirable-Lie-9191@reddit
AI isn’t going anywhere.
dudemanguy301@reddit
The author of the article and by extension the comments here are fixating on upscaling but what’s being ignored is the general topic of “neural rendering”.
Using an ML model to upscale is small potatoes compared to the research going into ML models being involved in the rendering process itself.
Intel:
https://www.intel.com/content/www/us/en/developer/articles/technical/neural-prefiltering-for-correlation-aware.html
AMD:
https://gpuopen.com/download/publications/2024_NeuralTextureBCCompression.pdf
https://gpuopen.com/download/publications/HPG2023_NeuralIntersectionFunction.pdf
Nvidia:
https://research.nvidia.com/labs/rtr/neural_appearance_models/
https://research.nvidia.com/labs/rtr/publication/diolatzis2023mesogan/
https://research.nvidia.com/labs/rtr/publication/xu2022lightweight/
https://research.nvidia.com/labs/rtr/publication/muller2021nrc/
With AMD unifying RDNA and CDNA into UDNA and a commitment to AI upscaling for FSR4, I think the path is clear for a situation where all GPU vendors and all consoles have some form of matrix-acceleration hardware built in. At that point the door will be wide open for techniques like these to be leveraged.
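To give a flavour of what "neural rendering" means in practice, here's a toy numpy sketch loosely in the spirit of the neural texture compression work linked above: instead of storing a full-resolution texture, the renderer stores a small grid of latent codes plus a tiny decoder network that turns a latent and a UV coordinate into a colour at shading time. Every size, layer shape, and the random (untrained) weights here are invented for illustration; the linked papers describe the real techniques and training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "compressed texture": an 8x8 grid of 4-dimensional latent codes instead
# of a full-resolution RGB texture. All sizes are illustrative.
LATENT_DIM, GRID = 4, 8
latent_grid = rng.normal(size=(GRID, GRID, LATENT_DIM)).astype(np.float32)

# Tiny decoder MLP (random weights here; a real system trains these jointly
# with the latent grid so the decoded output reproduces the original texture).
W1 = rng.normal(size=(LATENT_DIM + 2, 16)).astype(np.float32)  # +2 for the UV coords
W2 = rng.normal(size=(16, 3)).astype(np.float32)               # output: RGB

def sample_neural_texture(u, v):
    """Decode a colour at texture coordinate (u, v) in [0, 1)."""
    gx, gy = int(u * GRID), int(v * GRID)        # nearest latent cell (no filtering)
    feat = np.concatenate([latent_grid[gy, gx], [u, v]])
    hidden = np.maximum(feat @ W1, 0.0)          # ReLU
    return 1.0 / (1.0 + np.exp(-(hidden @ W2)))  # sigmoid -> RGB in [0, 1]

# At shading time the renderer evaluates the decoder per sample instead of a texel fetch.
print(sample_neural_texture(0.25, 0.75))
```

In a real implementation the decoder evaluation runs on the GPU's matrix units, which is exactly why built-in matrix acceleration across vendors and consoles matters for these techniques.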
SJGucky@reddit
In the future most NPCs will have some sort of AI. It is simply the next step.
Nvidia is already showing such NPCs.
EmergencyCucumber905@reddit
I remember being hyped about Oblivion's AI: https://youtu.be/BoljrqGCbwk?si=WQfamQa8WYxKUELm
Strazdas1@reddit
I wish they would just have some AI pathfinding. I play a lot of sim/strategy games, and holy shit, how many shortcuts devs take on pathfinding - it makes modding the game total hell because it just can't handle new pathfinding. Funnily, SPT had to rebuild the entire pathfinding mesh for bots to work properly. But not every mod author is going to do that.
That-Whereas3367@reddit
According to the shoeshine boys Nvidia will have a 100% share of AI hardware for the next century.
auradragon1@reddit
Eh... no one expects that.
That-Whereas3367@reddit
Pay a visit to r/NvidiaStock to see the level of delusion.
tukatu0@reddit
Never visited the sub, nor will I touch it. But if what you say is true, then it's just propaganda: one person pumping the stock for their own benefit.
That-Whereas3367@reddit
It isn't pumping. They are technically illiterate individuals (and naive investors) who really believe Nvidia has a monopoly on AI.
Strazdas1@reddit
Depends on how you define monopoly. A 97% market share would count as a monopoly by most definitions.
tukatu0@reddit
Well, the propaganda techniques are so developed here on Reddit, I wouldn't be surprised, my friend.
It's not like they are entirely wrong. They are a major shovel maker selling to gold diggers. Except they also sell drills to iron diggers. They make diamond drills too. If you come up with a new drill, they'll also help you develop it. So if someone makes it first, it's either them or Google. It's just that we know being first doesn't mean having a monopoly. Neither does it mean making the most money.
That's where the propaganda comes in. The best lies are half-truths seen through a specific lens. Just ban anyone who mentions that Google has equal capability on the software side. That, or they block your account and flood the thread with their own posts or comments, drowning out the smaller comments.
-WingsForLife-@reddit
If you go to a mental hospital you'll find people with mental health problems.
There's a reason they're not mainstream subs.
auradragon1@reddit
I don’t need to. I glance at AMD stock and have a good laugh often.
Strazdas1@reddit
It's hard to predict a century, but their software moat is certainly going to keep them at a vast majority share for the next decade.
tavirabon@reddit
Your reminder that AMD propped up ZLUDA to run CUDA applications on AMD hardware and then sabotaged it once it started working the tech out, setting the whole project back. NVIDIA has never gone after any such project; AMD is the one that wants a hardware divide if the alternative means using an already established ecosystem.
And for those that are particularly unaware of what possibilities lie ahead: https://gamegen-o.github.io/ - I can't imagine running this on Windows without some sort of CUDA solution. AMD would have to build their own platform with their own exclusives if they don't want to adopt the standard already used by the entire consumer industry.
justjanne@reddit
They can't, legally. Not only would reimplementing CUDA reawaken the Oracle vs Google case; the real value is in cuDNN, and even attempting to run that on AMD is not allowed.
Exist50@reddit
But Google won that. Same with Transmeta v Intel. The precedent seems pretty well established from a purely legal perspective.
justjanne@reddit
Google won on a technicality; the court also ruled that in most other situations copying APIs isn't allowed.
tavirabon@reddit
You're not understanding what happened. ZLUDA is FOSS. AMD gave some code to ZLUDA to help it out, and once it established itself as the primary CUDA solution, AMD threatened legal action. Not NVIDIA. Not Intel (because ZLUDA is for Intel too), but AMD.
justjanne@reddit
Then you're misunderstanding what happened. AMD employed the ZLUDA developer until it became legally too risky to continue, and let them go. Due to a miscommunication, the ZLUDA developer was told they could publish that code, which wasn't true.
yaosio@reddit
Don't forget about GameNGen. https://gamengen.github.io/ Although I don't think the method they use is going to scale, it shows that somewhat stable 3D worlds are possible.
salgat@reddit
We're going to get to a point where game developers will only need to fill a room with basic models and ML will do the rest to bring it up to a realistic level. You won't even need textures, just tags on either the entire model or on each surface. And even cooler, at that point you can swap between realistic, cel-shaded, etc. trivially if you decide to change the style you're going after.
raddass@reddit
You could buy Minecraft render styles rather than skins
Schmigolo@reddit
Basically raytracing, but for everything, not just illumination. I'd be looking forward to games only needing artists and no coders, but if I've learned anything, it's that the artists are gonna be replaced first.
Caffdy@reddit
wtf is this take? First, video games wouldn't exist without software devs, and second, if anything, AI would replace everyone in the assembly line and most likely only execs would keep their jobs, which would be a very sad state of affairs. The changes and transformations that AI will bring upon society are impossible to predict: who's gonna be replaced? To what degree? What will happen to the economy? To money? Etc.
Schmigolo@reddit
As a consumer I genuinely do not care the slightest bit about how a game is coded as long as it doesn't hold back my experience; I care about what I'm seeing.
A shit game with perfect code is just a shit game; a good game with shit code is a good but flawed game. So you tell me which one of these two has more potential, and where AI can do more good.
Brisngr368@reddit
I think even Nvidia's coffers will be dry by the time this happens.
RabidHexley@reddit
"Work smarter, not harder." There is still so much progress to be made in this direction.
masterfultechgeek@reddit
Semantics - I expect it'll be tensor acceleration (basically a bunch of matrices stacked together).
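As a toy numpy illustration of that (not tied to any particular vendor's hardware): a rank-3 tensor is literally a stack of matrices, and a batched matrix multiply over that stack is the kind of operation matrix/tensor units are built to accelerate.

```python
import numpy as np

# A rank-3 tensor as "a bunch of matrices stacked together":
# here, 32 independent 4x4 matrices per operand.
A = np.random.rand(32, 4, 4)
B = np.random.rand(32, 4, 4)

# Batched matrix multiply: one matmul per matrix in the stack.
# Dedicated matrix/tensor units accelerate exactly this pattern in hardware.
C = A @ B          # shape (32, 4, 4)

print(C.shape)
```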
donaldinoo@reddit
Would AI upscaling cause latency issues?
LiberArk@reddit
Or we can just say no to upscaling and force them to research better shit for actual performance. I'm sticking to native rendering and leaving upscaling for 1080p movies and shows.
pm_me_ur_kittycat2@reddit
How dare they not change the laws of physics that we're slowly butting up against with our tiny ass transistors?
LiberArk@reddit
Yes, keep buying AI upscaling. Soon the entire game will just be AI-generated slop and we'll be back to impressionist-painting levels of detail.
MrCawkinurazz@reddit
He talks up whatever brings him more money ATM: first mining, now AI. These smiling guys just want your money; if you believe otherwise, you live in a bubble.
acAltair@reddit
Lying snake. Upscaling and frame gen have become a substitute for optimisations.
ischickenafruit@reddit
Guy who sells GPUs says he wants to sell more GPUs.
WildRacoons@reddit
just plug me into a simulation already
gingeydrapey@reddit
Without node shrinks, computing performance is really hitting a ceiling, both in CPUs and GPUs.
ButtPlugForPM@reddit
I think he might be right though.
Is there really that much more room for improvement on the pure raster front? Like, didn't the AMD dude say that sure, they could make a monster AMD gaming GPU that wins at raster if we could have a 3.6GHz core clock?
Brisngr368@reddit
I think there's always room for improvement (shit game engines aside, ofc). There are a bunch of avenues that have never really been explored; MCM designs, I think, are the next big thing.
Brisngr368@reddit
It's Jensen just preparing the market for when they release the next gen cards and it's the same 4090 chip with extra AI bollocks on it
SovietMacguyver@reddit
Do you all realize that he's just stating what he wants to be true?
truthputer@reddit
NVIDIA has to invent new markets to sell into, otherwise their stock - which is priced based on infinite growth - will collapse.
I’m fairly sure most artists and game developers want nothing to do with endlessly derivative generative AI slop.
TheAgentOfTheNine@reddit
AI peddler peddles AI, more news at 8.
Present_Bill5971@reddit
Really, I just want vendor-neutral APIs. Everyone's got AI cores and ray tracing cores now, so we need vendor- and OS-agnostic APIs. Then we'll get some new hardware that targets highly specific algorithms and have another set of hardware-specific APIs to deal with until vendor-agnostic ones eventually appear.
haloimplant@reddit
I worry about a divide between those with potato eyes and those who can spot the defects. Hopefully we have the settings to turn this stuff down because I foresee my eyes puking over 97% inferred pixels while the potato eye people insist everything looks fine.
Berkyjay@reddit
This guy can seriously get fucked.
RedditorWithRizz@reddit
We (gamers) are the ones getting fucked with these GPU prices currently
GenZia@reddit
Because Huang knows the competition can't hold a candle to him on the AI front.
It's all about slaughtering fair competition at the altar of his own home turf.
The guy did kill off the likes of Silicon Graphics (SGI), S3, and 3dfx with tactics that could be considered 'shady' at best, and ATI, the last man standing, was in shambles by the time AMD acquired it in 2006.
Here's a nice, one-hour-long documentary covering NVIDIA's (invidia means "envy" in Latin) colorful past in creating the autocracy it enjoys today, in case anyone's interested:
Nvidia - Anti-Competitive, Anti-Consumer, Anti-Technology
Competitive-Door-321@reddit
Or NVIDIA invested in AI early because they predicted it would be the future... ? It feels like you're trying to drum up unfounded brand hate for no reason. I'm confused why your comment got upvoted on this sub.
DehydratedButTired@reddit
Jensen didn't predict shit. They saw companies ordering GPUs, then started adding functionality for their needs. That's how Nvidia works: they partner on software and tailor hardware for companies making software for their hardware. Once they started getting more orders they added it to presentations. Then it blew up in the investing space and now they can print their own money.
ResponsibleJudge3172@reddit
In 2020 you were booing him for being boring and not getting to the hardware stuff while he was talking about the trillion-dollar industry that AI had the potential to be.
DehydratedButTired@reddit
2018 is when he started his AI push for real. The RTX 20 series had AI cores with no purpose; they told people to ignore performance and believe that AI would make up for it. They sold people 5% increments for double the price for years on those promises, and they are just starting to deliver.
Jensen and Nvidia have never been boring. I've booed him for price gouging for years, then for manipulating the stock of his cards to maximize profits. He would rather choke the market than provide more cards at a lower price. Nvidia pivoted quite well. They took a business model that gouged enthusiast gamers and applied it to AI startups with great success.
zarazek@reddit
Man, saying that Nvidia is not innovative is plain ridiculous: this is the company that first introduced geometry processing, programmable shaders, a general-purpose compute API for GPUs, raytracing, neural network acceleration - and that's off the top of my head. Yes, some of these technologies existed before, but they were either experimental or used only in some super-professional products a few orders of magnitude more expensive.
About shady anti-competition tactics: almost every dominant market player does it. Intel did it. Microsoft did it. Google did it. And so on. I think that any other company in their position would do the same. That's the nature of monopolist or nearly-monopolist corporations. The people will forgive them as long as they are winning. The only thing that won't be forgiven is if they become stagnant. And the fall can be really quick. Look at Intel.
JonWood007@reddit
Keep in mind this guy is trying to sell a product. Of course you can still do graphics without AI; he just wants to force the market to move in that direction using de facto monopoly power.
redeyejoe123@reddit
AI for now might not be all that we envisioned, but since Nvidia is making the hardware, eventually AI will reach a point where it makes Nvidia hands down the most valuable company in the world. Imagine when they can have a true virtual assistant for front desks, secretaries which do not need a wage... AI will replace many, many jobs, and I am not sure how I feel about that, but it will happen. For that reason, all in on Nvidia stock...
kilqax@reddit
Ehhhh I'm not very keen on them showing this take. Not at all, actually. Simply because whatever they choose can change the whole market.
Slyons89@reddit
"You will need our proprietary technology and systems to continue producing your work or enjoying your hobby". Guess it doesn't change much when there's a lack of competition either way.
rrzlmn@reddit
I like the idea of a diffusion model with input from engine state, or a game rendered with low quality and using a style-transfer model to transform it into a realistic frame. If we can achieve that, it would be a huge jump in graphics with far less compute and development time.
CaptainDouchington@reddit
"Please, when I say AI, our stock is supposed to go up!"
MewKazami@reddit
I just hope that one day it happens to them, just like how they killed off a closed API (3dfx's Glide) with their superior OpenGL and D3D performance. One day a company will come along that absolutely shits on them and proves them wrong. It won't be AMD, so my last hope is another trash company, Intel. Maybe one day some Chinese company will beat them.
HybridPS2@reddit
What a load of horseshit lol. How about studios creating games with a strong art direction and style, instead of cramming more and more polygons on screen?
Or maybe spend a couple of months doing optimization before release?
Xpmonkey@reddit
still rocking the 1080ti like..... riight!
BrightPage@reddit
Why am I forced to pay more for their fake hallucination rendering? I want hardware that can natively render a scene for these prices goddamnit
LeotardoDeCrapio@reddit
... and then the thunderous mechanical key switches of triggered gamers, with little disposable income, could be heard in r/hardware
Old_Money_33@reddit
They need to justify AI on consumer chips and create a need for it; that way a unified arch (like the announced UDNA from AMD) makes sense instead of an optimized gaming uArch plus a separate compute/AI uArch.
AMD is going the same way with UDNA; they see the workloads converging, and that makes a gaming-only uArch undesirable to both of them (Nvidia and AMD).
Belydrith@reddit
Well, the hardware divide always happened in the past as well; back then it just meant a generation delivering 70% additional performance, eventually leaving those on older hardware behind. Those gains are unrealistic nowadays, and instead features like upscaling will create a more binary division.
The fact that you can still run most stuff these days on a 10 series card alone should be enough evidence that it's really not much of an issue at this time. Hardware is lasting us longer than possibly ever before.
mb194dc@reddit
Or you can just turn down the details. Upscaling introduces artifacting, shimmering and other losses in display quality. It's a step back.
The main reason they're pushing it is so they can downspec cards and increase margins.
Famous_Attitude9307@reddit
So the 1080 ti doesn't work anymore? Got it.
SireEvalish@reddit
AI upscaling allows for higher image fidelity without having to spend GPU horsepower on the extra pixels. It makes sense to allocate those resources to things that have a larger effect on perceived visual quality, like lighting, draw distance, etc.
Real-Human-1985@reddit
We can’t work without these tools that we sell.