How to Actually Make HDR Look Amazing on Windows 11 (Not Washed Out Trash)
Posted by ArcBeev1@reddit | buildapc | 62 comments
Alright folks, I wanted to drop my full HDR setup guide since I’ve seen a ton of people say “HDR looks washed out, SDR is better.” Nah… HDR can look absolutely insane when you set it up right. Most people just never dial it in properly.
Quick background: I do professional filmmaking and color grading, and I’ve been gaming for almost 28 years now—daily. So yeah, I’m pretty obsessive about how my games look.
Here’s my full step-by-step breakdown:
Step 1 – Monitor settings
- Kill any “enhanced sharpening” modes your monitor has, they usually make things worse.
- Stick to sRGB mode in most cases.
- Adjust brightness to taste, but don’t go crazy—use in-game brightness sliders for fine tuning.
- Make sure HDR + VRR are enabled in your monitor’s settings.
- If sharpening is needed, use the game’s built-in sharpening or ReShade, not the monitor.
Step 2 – Windows HDR settings
- Disable Auto HDR (seriously, it’s trash). Only enable real HDR in Windows.
- Calibrate brightness (I find 47 is a good baseline, but it’ll depend on your display).
- Use the Windows HDR Calibration Tool from the MS Store. Only touch brights and darks—leave saturation alone.
Step 3 – NVIDIA app settings
Forget RTX HDR—it’s useless hype. Instead:
- Global settings → G-Sync ON (fullscreen + windowed if supported).
- Display settings → Set native resolution + refresh rate. Test carefully: running highest refresh rate can sometimes break 10-bit HDR if your cable/port can’t handle it.
- Scaling → Just leave it default, no reason to mess here.
- Color settings →
- Use NVIDIA color, not default.
- Desktop color depth: 32-bit
- Output format: RGB
- Output color depth: 10-bit (if supported)
- Output dynamic range: Full
Now the important part—color channels. Don’t just leave it at “All channels.” Adjust R/G/B separately, otherwise you’ll never get proper HDR.
Here’s what I use (tweak to taste):
- All Channels → Brightness 100 | Contrast 119 | Vibrance 83
- Red → Contrast 120 | Vibrance 83
- Green → Contrast 120 | Vibrance 83
- Blue → Contrast 119 | Vibrance 83
HDR instantly stops looking washed out with this. Even on a 400-nit monitor, it looks way better than SDR once calibrated.
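NVIDIA doesn’t publish the math behind these sliders, so as a rough mental model only (not NVIDIA’s actual implementation): a contrast slider scales each channel around mid-gray, and a vibrance slider boosts saturation more for muted pixels than for already-vivid ones. A hypothetical sketch of that idea:

```python
# Rough sketch of what contrast/vibrance-style sliders do to a pixel.
# This is NOT NVIDIA's actual math -- just the general idea, for intuition.

def apply_contrast(v, contrast):
    """Scale a 0..1 channel value around mid-gray. contrast=1.0 is neutral."""
    return min(1.0, max(0.0, (v - 0.5) * contrast + 0.5))

def apply_vibrance(r, g, b, vibrance):
    """Boost saturation, weighted so muted colors move more than vivid ones."""
    sat = max(r, g, b) - min(r, g, b)  # crude saturation measure
    boost = vibrance * (1.0 - sat)     # vivid pixels get less of a push
    avg = (r + g + b) / 3.0
    return tuple(min(1.0, max(0.0, c + (c - avg) * boost)) for c in (r, g, b))

# A contrast slider at 119 (where ~100 is neutral) maps to roughly a 1.19x
# scale: values above mid-gray get pushed up, values below get pushed down.
print(round(apply_contrast(0.6, 1.19), 3))  # 0.619
```

This is also why cranking contrast can crush shadow detail, as some replies below point out: everything below mid-gray gets pushed toward black by the same factor.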
Step 4 – Per game tweaks
Some games only work in true fullscreen HDR, some only in borderless. Example:
- Resident Evil Remakes → true fullscreen works best.
- Diablo 2 Resurrected → HDR looks better in windowed (borderless).
Pro tip: you can use the “Windowed HDR Workaround” DLL for games that don’t behave, such as RE Remakes. Pair that with NVIDIA’s per-game DXGI swapchain setting.
Example settings I run for Diablo 2R:
- Monitor Tech → Fixed Refresh
- OpenGL GDI → Prefer Compatible
- Vulkan/OpenGL Present → Prefer layered on DXGI swapchain
- V-Sync → Use app setting
This allows D2R to use HDR correctly in Fullscreen, not just Windowed.
Step 5 – In-game HDR setup
Tweak brightness, contrast, and black levels per game. There’s no universal rule here, test and lock in what feels right.
Step 6 – ReShade (optional but amazing)
For games without good sharpening, I recommend ReShade. My favorite is Simple Realistic for RE4 Remake 2023 by Crubino (works with a bunch of games, not just RE4). It’s lightweight, not bloated, and works really well on top of the NVIDIA color tweaks above.
Step 7 – Frame pacing & latency
- Use MSI Afterburner + RTSS.
- Cap your FPS in RTSS (not in-game, not in NVIDIA app).
- Use Reflex with RTSS frame cap for the smoothest frame pacing + lowest input lag.
- Don’t use V-Sync or “low latency mode” in control panel.
RTSS is hands down the best way to cap FPS—no debate.
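RTSS does its capping at a low level around the present call, but the scheduling idea behind any good frame limiter is the same: wait to an absolute per-frame deadline so timing errors don’t accumulate. A toy sketch of that idea (not how RTSS is actually implemented):

```python
import time

def frame_limiter(target_fps, frames):
    """Toy frame limiter: waits until each frame's absolute deadline.
    Scheduling against absolute times (instead of sleeping a fixed delta
    each frame) keeps pacing even, because errors don't accumulate."""
    frame_time = 1.0 / target_fps
    next_deadline = time.perf_counter() + frame_time
    times = []
    for _ in range(frames):
        # ... render the frame here ...
        while time.perf_counter() < next_deadline:
            pass                     # busy-wait for precision (burns CPU)
        times.append(time.perf_counter())
        next_deadline += frame_time  # absolute schedule: no drift
    return times

times = frame_limiter(target_fps=500, frames=10)
deltas = [b - a for a, b in zip(times, times[1:])]
print(max(deltas) - min(deltas))  # frame-to-frame jitter should be tiny
```

The busy-wait is what buys the precision; a plain `sleep()` can overshoot by a millisecond or more, which is exactly the kind of pacing error a cap is supposed to remove.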
And that’s it. Follow these steps and HDR stops looking “washed out” and actually looks like the upgrade it’s supposed to be.
Enjoy your properly tuned HDR 😎
DETERMINOLOGY@reddit
My opinion: buy a solid OLED TV. I've seen HDR on an S90C and S90F and it looks FAR better than any OLED monitor to date.
That alone is why people say they're staying with SDR, but it's usually those using a PC gaming monitor. Even the PG32UCDM, which is good for HDR and one of the top-rated monitors, the S90C makes look extremely bad in brightness and color vibrancy. Totally different levels.
Also I wish HDR was as widely supported on PC as it is on console, like every game working with HDR without any hassle.
Melodic-Luck-8772@reddit
my brother, he understands it.
monitors, even the NEWEST and BRIGHTEST TANDEMS, suck at HDR. really.
even the newest PG32UCDM or PG32UCDM3 or whatever they are called suck at hdr.
i do have an LG 32GS here on my left which does like 600 nits at 10% and peaks at 1300 nits at 2% apl.
when i compare that to my C5 42 inch at 800 nits for 10% apl... it sucks.
really, if you want good monitor HDR, aim for that 600 nits at a 10% window. that should be adequate.
the issue is that there are only like 2-3 models that can do 600 nits at 10% sustained. even the newest tandems do like 500 nits at 10% windows and 400 nits at 25%, and that is unfortunately dim.
instead of giving us monitors that can achieve real brightness, we get stupid tonemappers that artificially brighten the image.
AngryWildMango@reddit
or if you need a monitor: Cooler Master Tempest GP27Q. QDot, miniLED, 1200 nits, very color accurate, 165Hz, 1440p, really good HDR. It has software quirks, but if you need the features I listed and can't afford an OLED, it's comparable. It's so damn good. And stupid bright lol. 1200+ish nits at full screen is intense.
(I would love an OLED with the same features tho!)
DETERMINOLOGY@reddit
I still would choose an OLED TV over any monitor. TVs, even the S90C, can overall get much brighter than any monitor and the HDR impact is much more robust, it's not even close. For HDR it's OLED, QD-OLEDs to be exact.
burningpizzacrust@reddit
As someone who does a lot of work on display and optics, this whole thing makes me sad.
You aren’t calibrating anything, you’re just setting your preferences based on what “looks” good to you. What makes HDR good isn’t just colors.
Sure it may be better than what Windows gives you, but that doesn’t mean much.
Willing-Coconut8221@reddit
Ok dude then like, can you tell me how to make it actually look not bad? I've been trying to make HDR look good even with HDR-supported hardware and it's all just looked like shit, colors have been washed tf out
Willing-Coconut8221@reddit
I managed to fix it being washed out by using HDMI instead of USB-C, but now it's all RED
ArcBeev1@reddit (OP)
What’s sad is how quick people are to jump on the toxicity train here. This guide is clearly aimed at the average user — the kind of person who just bought a PC, turns on HDR, and wants it to look good without spending hours tweaking.
For advanced users, sure, you’d go with something like ReShade (which I literally mentioned in the post) to get access to all kinds of shading options. RenoDx is cool too, but it doesn’t even work with every game — only the ones it supports. And it’s not the only tool out there anyway.
People act like the average user is just going to magically know all the stuff we know… they don’t. Some don’t even know where the Windows HDR settings are in the first place.
This isn’t supposed to be some hardcore modder’s guide — it’s a straightforward, universal setup that works across the board and makes HDR actually look great out of the box. Nothing “sad” about that. It’s just a useful guide, not an invite for nitpicking.
AngryWildMango@reddit
if your guide is aimed at the average user, you are just misleading them, not helping. also, if someone doesn't know about windows HDR settings and has an HDR display, they won't be looking up anything to find this post anyway.
also, it is an invite for "nitpicking". you posted incorrect info on a forum that anyone can reply to.
well, it's not incorrect, if you like it that's sick, but it's misleading when you say "this is the correct way to do this" when it very much isn't. it's your way of doing it.
also RTX HDR isn't hype lol, go look up a comparison video with that and renodx. Reno is better, but only VERY slightly.
ArcBeev1@reddit (OP)
I didn't know mangos can get so angry hahaha
yeeeew99@reddit
It's not toxicity, it's advice to others. The problem is that when you don't calibrate for accuracy, the settings are all very specific to your particular panel/setup (which you haven't listed), and they won't look the same to anyone else using a different system.
So you can't just say it's for the "average user" as an excuse for it not being an accurate calibration. Also, throwing in tools like reshade kind of invalidates it, as reshade is just a post filter that I would definitely not use to calibrate an image/game. Combined with the fact that you're cranking up your contrast to 120 and your vibrance to 83 makes me think that A) you have a very poor implementation of HDR on your monitor, and B) you definitely don't work in "professional filmmaking and color grading"
ArcBeev1@reddit (OP)
Rage Bait
burningpizzacrust@reddit
I mean, are we even seeing the same image? There's compression, different formats, and who knows how these file-sharing services are compressing the data.
Would you be able to provide at least a 10-bit image in TIFF format? And do you assume I have access to a calibrated display with more than 10 bits and an infinite contrast ratio to truly see what you're seeing?
ArcBeev1@reddit (OP)
Excuses, you proved my point and as for the obvious redditor toxicity here is a link https://youtu.be/IcrbM1l_BoI?si=LEgqUe6VE_4khaFW
burningpizzacrust@reddit
And why do you think YouTube won't crush all of that 10+ bit quality etc? It's really not helping the case of "properly tuned" when it's just aesthetically good.
I have nothing against your instructions to how to use HDR functions to get “more aesthetic” images in game and desktop. Calling it “properly tuned” is where we are disagreeing.
yeeeew99@reddit
Mate don't waste your time. OP isn't even contributing to the conversation about color accuracy and balance.
ArcBeev1@reddit (OP)
A sheep has the same grazing patterns
yeeeew99@reddit
Never Put Passion In Front Of Principle. Even If You Win, You Lose.
ArcBeev1@reddit (OP)
Kings and Queens are but one in millions.
ArcBeev1@reddit (OP)
You have an IQ of a pigeon
vegtro@reddit
So much incorrect information here. The in-game peak brightness has to match your monitor's peak brightness. If not, you will get clipping and lose detail. Also, adding too much contrast will crush shadows and dark areas, making the image darker than it should be. If you want the image to be more saturated, just use digital vibrance. No mention of RenoDX.....
Melodic-Luck-8772@reddit
yea, anyone who is doing HDR coloration via nvidia or amd software or the hdr calibration tool is doing it wrong.
you want HDR for accuracy.
sdr is broken on modern monitors/tvs if you don't clamp the colors.
so what's essentially happening is that oleds, for example, map sdr colors to their wide gamut, so they get stretched beyond their intended range. then they appear oversaturated.
this is an issue oleds have.
some people like this oversaturated look, but it's actually wrong.
THEN, when they switch to HDR, all the wrongly mapped colors are mapped correctly and less saturated.
this is where people are thrown off and think that HDR is broken, when in reality their image is fixed.
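The "stretching" described above can be shown with the standard color matrices: if a panel displays sRGB values using wide-gamut primaries (Display P3 here) without clamping, even a plain sRGB red lands outside what sRGB can encode. A minimal sketch, using the published D65 matrices, with linear values (gamma omitted for simplicity):

```python
# What happens when sRGB values are shown on a wide-gamut (Display P3) panel
# without clamping: treat the sRGB triplet as if it were P3, convert back to
# sRGB, and see how far out of gamut it lands.

P3_TO_XYZ = [
    [0.48657, 0.26567, 0.19822],
    [0.22897, 0.69174, 0.07929],
    [0.00000, 0.04511, 1.04394],
]
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def matmul(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Pure sRGB red, wrongly displayed using the panel's P3 primaries:
xyz = matmul(P3_TO_XYZ, [1.0, 0.0, 0.0])
r, g, b = matmul(XYZ_TO_SRGB, xyz)
print(round(r, 3), round(g, 3), round(b, 3))
# r > 1 while g and b go negative: redder than anything sRGB can encode,
# which is exactly the "oversaturated skin/grass" look people describe.
```

White stays white, since both spaces share the D65 white point; it's the saturated colors that get pushed out, which is why unclamped SDR looks punchy rather than uniformly wrong.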
xDeadly95@reddit
Luckily you can clamp them to sRGB on ASUS OLEDs, or by setting DCI-P3 color mapping in Windows they get less saturated, although I don't know if they're correct since I've heard so many people saying to stick to sRGB.
kurushimee@reddit
I use my QD-OLED in SDR mode without clamping the colors at all. I am, and absolutely always was, aware that it's technically not supposed to be this way. However, I myself couldn't possibly give less fucks: it simply looks so much better "oversaturated".
And, subjectively, what I think is that the absolute majority of devices viewing any kind of SDR content oversaturate it. Considering that the average user isn't going to change any settings at all, there aren't that many people actually clamping SDR down to sRGB. SDR is made in sRGB because it's a widely supported standard, not because that's how it's "supposed to look". When creating SDR content, creators pretty much have to accept that most people won't view it in actual sRGB.
Also, an even more subjective point: sRGB absolutely doesn't hold up in terms of realism. The most vibrant color sRGB can achieve looks very dull compared to colors you see every single day IRL. Oversaturating the colors in SDR actually looks closer to reality, since IRL most things surely don't look washed out.
Melodic-Luck-8772@reddit
yea sure, if you like it that way, use it that way. it's your preference.
i like the colors more accurate.
how you use your display is up to you.
some people like the reddish skin, or grass that looks like it's on crack :/. i don't.
it's a preference thing.
of course devs are aware that a lot of people are looking at broken sdr colors.
me personally, i don't like those oversaturated color artifacts.
kurushimee@reddit
well, at the same time, I of course don't like red skin or toxic-green grass or anything like that. I like what looks natural to me. And unclamped SDR on my QD-OLED sure looks natural to me, doesn't look one bit oversaturated — the skin tones and greenery colors all look natural.
My preference comes with the fact that clamped-down sRGB looks unnatural to me, not that I like oversaturating specifically for the sake of saturation. sRGB simply looks wrong, looks washed out
Melodic-Luck-8772@reddit
it also depends on the profile you're using and the color settings on your monitor.
some monitors are more "red-ish" than others.
depending on the manufacturer this can be a bit overtuned.
maybe you are already using the sRGB profile on your monitor, who knows.
also in general, whether you use HDR, clamp colors, or whatever: let your eyes adjust. cannot stress this enough.
the best way of course is to leave HDR enabled all the time and find a good setting for that win11 content slider. upon enabling HDR the SDR colors get mapped correctly.
if you one day decide to clamp colors, only use one clamping method. never double clamp :).
ArcBeev1@reddit (OP)
Look at it from different perspectives. Not everyone is using an OLED, and the average person has a high enough IQ to know that if they do have an OLED, they should look for a guide tailored to OLEDs.
This is for monitors that have washed-out looks, mainly IPS and VA.
It's a very relatable, easy-to-use guide that most people will be familiar with, even those who just want convenience.
People are misled into disabling and enabling HDR every time per application, which is just not feasible in real-world use cases.
Maybe Windows 12 will fix all the issues, but for now, we're stuck with Windows 11.
Still better than Windows 8 :D
ArcBeev1@reddit (OP)
You clearly didn’t read the guide properly or test it out. I specifically mentioned multiple ways to adjust contrast, and also noted that results can vary depending on your setup. The point of the guide is to provide a strong baseline for making HDR look great.
Adjusting contrast is essential for getting proper blacks and whites, and from there you can use either ReShade, in-game settings, or both for finer tuning.
Some people will always be toxic for no reason—that’s just how it is. Luckily, the ones who actually understand what I’m saying will get it.
vegtro@reddit
Sorry, on step 5, there is a universal rule. If you're a filmmaker, you do know about peak brightness, correct? You can't just put any number in there. You do not need to adjust contrast if you use the Lilium black level raise fix on certain games, or just use RenoDX. Also, you can't just say "use your settings" if you do not explain what monitor or TV you're using, since they are all different and not universal.
ArcBeev1@reddit (OP)
Not every game treats every setting according to rules.
ArcBeev1@reddit (OP)
Also you mentioned clipping, and again I stated it can depend per game. Some games don't clip some do.
Just_Maintenance@reddit
With "properly tuned HDR" do you mean accurate or just "looks good"?
In my experience HDR works decently when watching HDR content (as long as the brightness is setup correctly), but Windows has no idea how to map SDR content onto an HDR display and it all falls apart. AutoHDR and RTX HDR can bridge the gap in games by trying to fake HDR.
The only net positive experience I have had with HDR is with MacOS. It looks perfect all the time (as long as the display is high-resolution).
Melodic-Luck-8772@reddit
windows even sets the sdr gamut to wide gamut. you have an option called auto color management that clamps srgb to its respective color space and normalizes sdr colors.
so when you switch to hdr, the colors are actually mapped correctly. xD
ArcBeev1@reddit (OP)
Precisely, however it's a rigmarole to keep switching, hence this easy guide was formed. Most people don't understand that it's literally the best universal method actually. Because I can assure you 90+% of people never even turn HDR on lol
You get the point "I think"
Ratemytinder22@reddit
Your "guide" literally tells people to clamp the monitor at sRGB "in most cases" and use HDR. Literally the dumbest thing I have heard lol
This "guide" should be called "How to get oversaturated and incorrect colors"
ArcBeev1@reddit (OP)
Yeah, I can tell you probably jump into a game once in a while, whereas I’ve already played and finished 20+ games just this year. Auto HDR and RTX HDR aren’t true HDR—they’re just tone-mapping tricks.
The reason people fall back on them is because they never bother adjusting contrast properly in the NVIDIA app. Instead, they tweak it in SDR, then complain it looks bad.
Just follow the guide, lol.
ArcBeev1@reddit (OP)
Here is a link to what my desktop looks like to get an idea https://imgur.com/a/kYdvU8b
Ummgh23@reddit
My brother in christ.. a screenshot is not going to show anyone what it looks like on your monitor lmao
ArcBeev1@reddit (OP)
here have a brain 🧠
Ummgh23@reddit
AI Generated slop post
deleted_by_reddit@reddit
[removed]
buildapc-ModTeam@reddit
Hi there! Thanks for the comment.
We ask that posts and comments be in English so they can be understood by as many people as possible. Translations on Reddit are client-side, and not all apps or browsers support auto-translate. Currently many users (and moderators) aren’t able to read your comment.
Could you please resubmit this in English?
Setheleh85@reddit
I have an Odyssey G9 and feel like I got an OLED one. Thank you so much.
ArcBeev1@reddit (OP)
Nice monitor
Melodic-Luck-8772@reddit
giving it more color in nvidia is not going to make anything fixed, accurate or better looking.
you say that you're a professional color grader and film maker. but adding more "color" or artificial "brightness" via the nvidia or amd software is not the way to go.
as a professional color grader and film maker you probably should know that ALL oleds and newer monitors/tvs are wide gamut and map srgb colors to wide gamut, thus making them appear oversaturated.
so basically what we are all seeing on our IPS and OLED panels is srgb colors that are stretched beyond their intended range.
so these oversaturated colors are kinda in a category like "artifacts". so-called coloring artifacts.
you as a professional color grader should also know that it's like this.
the moment you enable HDR on windows, all the "wrongly" oversaturated colors from SDR mode get mapped to their correct HDR color gamut and "normalize" again.
we are just thrown off by that because we are all used to looking at wrong sdr colors.
all you literally have to do is adjust the sdr content slider to paper white by opening a pure white image and holding a white piece of paper next to it in real life... that easy. you just have to eyeball this.
what i recommend to everyone using OLED or IPS with HDR:
clamp your srgb colors, and use it for like an hour to get a feeling for how SDR should look.
some tvs or monitors have a function or profile for this.
you can also use, let's say, gamer 1 on your monitor and use windows acm or your gpu's software to clamp colors down to srgb.
then, with a "better eye" and a feeling for the brightness and colors, you should go and try HDR.
of course, if you've looked at oversaturated sdr colors your whole life, you're going to think that hdr looks bad.
but in reality, those normalized colors are the "actual" colors.
here are some calibration tips:
do not add any artificial color or brightness on windows machines.
you can indeed put brightness and peak brightness on MAXIMUM / 100% on most modern tvs or oleds.
TV calibration for the w11 hdr tool:
if you want to play with DTM, turn it off first, then calibrate and enable it afterwards. this way your HDR metadata is not altered.
if you want HGIG, you ENABLE it while calibrating. it's designed that way.
the rest is done via the TV. in HDR you most likely will only have to adjust peak brightness and oled brightness.
if your tv has a gaming mode, use that. don't overdo the colors.
you want HDR to fix your colors and brightness, not mess them up.
for a monitor:
peak brightness max
100% brightness
0% added saturation, not in nvidia/amd or in the w11 hdr calibration
the rest is most likely handled by your monitor's algorithm.
then you can adjust black levels with settings like black stabilizer.
if you don't watch HDR content, go back to sdr.... the windows SDR slider is hard to get right.
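For anyone curious what the "sdr content slider" step above is doing numerically: in HDR mode Windows composites everything into a PQ (SMPTE ST 2084) signal, and the slider picks the nit level that SDR white maps to. Which slider position corresponds to which nit level isn't documented, but the PQ curve itself is published in the standard:

```python
# PQ (SMPTE ST 2084) inverse EOTF: maps absolute luminance in nits to the
# 0..1 signal an HDR10 display expects. Constants are from the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Nits (0..10000) -> PQ signal value (0..1)."""
    y = nits / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# Common SDR "paper white" targets people pick with the slider:
for nits in (100, 203, 300):
    print(nits, round(pq_encode(nits), 3))
# 100 nits lands around 0.508 signal; 203 nits is the BT.2408 reference
# level for graphics white, which is why it's a popular slider target.
```

This is also why the paper-comparison trick works: you're just finding the nit level at which white on screen matches diffuse white in your room.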
ArcBeev1@reddit (OP)
Thing is, this is a sort of universal guide for most people. And actually the most practical at that.
"Quick backround:" was to give context as someone with experiance, I know what the most average users will relate to.
prohealthypets@reddit
Idc to read this dude's yapping; he's clearly just tryna flaunt and is exactly the type that gives Reddit that "erm actually" vibe. Anyways, thanks for your concise tutorial, the colors look amazing!
AngryWildMango@reddit
if you are happy that's great, but this isn't "erm actually". It's just the real factual information. lol it's not an opinion.
ArcBeev1@reddit (OP)
You're welcome, and of course do what's best for you
JayViTales@reddit
As someone passing by, I'm a bit lost by the last lines. Changing my monitor to 0 saturation makes everything greyed out / loses color. Is this just to test the black levels or values?
AngryWildMango@reddit
I just tested the settings on my Cooler Master Tempest GP27Q. It looks like shit. Please remove the "properly tuned" it's literally the opposite. I really don't want to come across as being mean and rude. If you like it/it works on your display, that's great. But don't fucking lie to people who don't know what they are doing. It's spreading misinformation
justin38383@reddit
What about AMD GPUs?
sgtNeXu5@reddit
Hey, might be a stupid question, just wanna know where you are adjusting and tweaking the color?
Certain values don't go over 100
ConfidentHistorian89@reddit
he adjusted it in the NVIDIA app, not the NVIDIA Control Panel
ArcBeev1@reddit (OP)
Correct :)
ArcBeev1@reddit (OP)
This is exactly what I use now too since the update. They now go to 120, so yes this is correct :)
ElegantUmpire9689@reddit
I discovered that in my setup (RTX 4090 and 34WP65C) I can’t enable 10 bpc in the NVIDIA settings, but by lowering the refresh rate from 160Hz to 144Hz I’m able to switch to 10 bpc.
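The 160Hz-to-144Hz trick above comes down to link bandwidth: 10 bpc costs 25% more bits per pixel than 8 bpc, plus blanking overhead on top. A back-of-envelope calculator (the ~7% blanking figure is a rough assumption; real CVT-R2 timings vary per mode):

```python
# Back-of-envelope display bandwidth: why dropping 160Hz -> 144Hz can free
# enough link budget for 10-bit color on a 3440x1440 ultrawide.

def link_gbps(width, height, hz, bpc, blanking=1.07):
    """Approximate link bandwidth in Gbps for an RGB full-range mode.
    `blanking` is a rough overhead guess, not an exact timing."""
    bits_per_pixel = bpc * 3                       # R, G, B
    pixels_per_sec = width * height * hz * blanking
    return pixels_per_sec * bits_per_pixel / 1e9

mode_160_10 = link_gbps(3440, 1440, 160, 10)
mode_144_10 = link_gbps(3440, 1440, 144, 10)
print(round(mode_160_10, 1), round(mode_144_10, 1))

# 10-bit always costs exactly 25% more than 8-bit at the same mode:
print(round(mode_144_10 / link_gbps(3440, 1440, 144, 8), 2))  # 1.25
```

DisplayPort 1.4 without DSC carries roughly 25.9 Gbps of payload, so 3440x1440 at 160Hz 10-bit sits right at the edge under these assumptions, while 144Hz fits with room to spare, which matches the behavior described above.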
Both_Load_5385@reddit
Thanks for this guide, it really helped me out. I didn't even think the nvidia color settings could make such an impact with HDR but it did
ArcBeev1@reddit (OP)
You're welcome :)