Custom Resolution Utility (CRU)
12-23-2015, 03:33 PM
Post: #1881
RE: Custom Resolution Utility (CRU)
First of all: Thanks for this great tool.
Now to my question: what exactly does the 'HDMI support' option in the EDID extension block do to the video signaling?

Here's my problem: I'm trying to pass HDMI audio to an A/V receiver with the help of an HDMI splitter. I have the following hardware: an NVIDIA GeForce GTX 980 Ti connected to 3 displays (a DELL U3011 via DP @ 2560x1600 @ 60 Hz and two DELL 2007FPs via DVI/HDMI @ 1600x1200 @ 60 Hz). To get the audio signal over HDMI to the receiver, I connected one of the 2007FPs to an HDMI splitter via an HDMI-to-DVI cable, and the second output of that splitter to the receiver. Now I choose 1600x1200 @ 60 Hz as the resolution for that display in Windows, but the display keeps telling me to choose the right resolution (which is 1600x1200 @ 60 Hz). So I figured CRU might help with a possibly wrong EDID due to the use of said splitter.

I configured a standard resolution exactly as the EDID info on the 2007FP says and configured an extension block with only 'Audio formats'. This works in that the display reports a native resolution of 1600x1200 in Windows and does show the Windows desktop, but the HDMI audio device for that display is disabled and cannot be used - so no audio here. If I additionally add 'HDMI support' to the extension block, everything else stays the same (reported native resolution in Windows and so on), and this time the HDMI sound device becomes active and plays sound over the receiver. Unfortunately, now the display doesn't show anything other than a message saying I should switch to a supported resolution (1600x1200 @ 60 Hz). Something seems to tell the graphics driver to change the output signal when 'HDMI support' is enabled, so that the DELL 2007FP can't detect the signal as valid?! Any ideas?
12-23-2015, 04:03 PM
(Last edited: 12-24-2015, 03:09 AM by zamar19)
Post: #1882
RE: Custom Resolution Utility (CRU)
@Breit
In my case, a passive DP-to-HDMI adapter caused the GPU driver to cut the DP port bandwidth in half, but using the AMD Patcher fixed that (it's reversible). The same may be happening with your (passive?) HDMI 1.4 splitter if it doesn't allow the devices to share the GPU pixel clock. Or maybe the passive splitter is setting a lower bandwidth and resolution per device in order to feed both from the same source? Did you try running the NVIDIA Patcher and rebooting to see if it helps?
12-23-2015, 04:07 PM
Post: #1883
RE: Custom Resolution Utility (CRU)
(12-23-2015 04:03 PM)zamar19 Wrote: @Breit

No, I didn't try that yet; maybe it's worth a shot. Thanks. Although with the cabling and everything else just the same (including the HDMI splitter), it does work without 'HDMI support' in the EDID extension block. So this bandwidth limit must be tied to that?!
12-23-2015, 04:27 PM
(Last edited: 12-23-2015, 04:31 PM by zamar19)
Post: #1884
RE: Custom Resolution Utility (CRU)
If everything works through the HDMI splitter, including audio to your receiver and video to the monitor at max resolution, why do you need any virtual EDID changes?
12-23-2015, 06:47 PM
Post: #1885
RE: Custom Resolution Utility (CRU)
(12-23-2015 04:27 PM)zamar19 Wrote: If everything works through the HDMI splitter, including audio to your receiver and video to the monitor at max resolution, why do you need any virtual EDID changes?

Sorry if that was misleading, but by 'everything works' I meant that the video part works, as I mentioned in my initial post. It seems to be either video or audio, depending on the presence of this 'HDMI support' flag in the EDID extension block.
12-24-2015, 04:08 AM
Post: #1886
RE: Custom Resolution Utility (CRU)
(12-23-2015 03:33 PM)Breit Wrote: Now to my question: What exactly does that 'HDMI support' in the EDID extension block do to the video signaling?

HDMI support tells the graphics driver that the display supports HDMI signals. Without HDMI support, HDMI works like single-link DVI. HDMI support is required for audio. DVI does not carry audio data.

(12-23-2015 03:33 PM)Breit Wrote: Something seems to tell the graphics driver to mess with the output signal if 'HDMI support' is enabled so that the DELL 2007FP couldn't detect the signal as valid?!

HDMI displays can normally handle DVI signals, but there's no guarantee that DVI displays can handle HDMI signals. Why not plug the receiver directly into the video card? If you need to use a monitor, use the Dell U3011 instead. It can handle 2560x1600 @ 60 Hz with HDMI, but you'll need to add a custom resolution.
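(For reference: what the 'HDMI support' checkbox corresponds to in the EDID is a vendor-specific data block in the CTA-861 extension carrying the HDMI IEEE OUI 00-0C-03; that block is what tells the driver the sink is HDMI rather than DVI. Below is a minimal sketch of how one could check an exported EDID extension block for it. The function name and the `ext` variable are illustrative, not part of CRU.)

```python
# Minimal sketch: scan a 128-byte CTA-861 extension block for the HDMI
# vendor-specific data block (IEEE OUI 00-0C-03). `ext` is assumed to be the
# raw extension block bytes, e.g. from an EDID dump exported with CRU.

HDMI_OUI = (0x03, 0x0C, 0x00)  # OUI is stored LSB-first inside the data block

def has_hdmi_vsdb(ext: bytes) -> bool:
    if len(ext) < 4 or ext[0] != 0x02:   # 0x02 = CTA-861 extension tag
        return False
    dtd_offset = ext[2]                  # where detailed timings start
    i = 4                                # data block collection starts at byte 4
    while i < dtd_offset:
        tag = ext[i] >> 5                # bits 7-5: data block type
        length = ext[i] & 0x1F           # bits 4-0: payload length
        if tag == 3 and length >= 5:     # 3 = vendor-specific data block
            if tuple(ext[i + 1:i + 4]) == HDMI_OUI:
                return True
        i += 1 + length
    return False
```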
12-24-2015, 05:12 AM
Post: #1887
RE: Custom Resolution Utility (CRU)
(12-23-2015 05:28 AM)zamar19 Wrote: Thanks again. I'll follow your advice. Why does YCbCr 4:4:4 have a better chance to show 4:4:4 than Full RGB 4:4:4, despite Full & Limited RGB being the only options in FirePro CC for 4K resolution, while YCbCr 4:4:4 options also show up for lower resolutions? Does it require lower bandwidth - your Color Correction article doesn't mention it? See this monitor EDID below.

That's not my article. RGB 4:4:4 should work, but the TV might be converting it to YCbCr 4:2:2 internally. YCbCr 4:4:4 uses the same amount of bandwidth as RGB 4:4:4. The EDID shows YCbCr 4:4:4 support is already enabled in the default extension block. If you can't select it at 3840x2160, then that's a driver or hardware limitation.

(12-23-2015 05:28 AM)zamar19 Wrote: How would adding 4:4:4 output and 10-bit color add to the required bandwidth? Will it still fit into the ~400 MHz limit, or do these specs not affect it much? FirePro CC shows a DFP choice of 8, 10, 12 bpc for the monitor at lower resolutions, but only 8 bpc for 4K @ 30 Hz, so is it restricted by the DP-to-HDMI adapter bandwidth? Would it allow switching to 4K 10-bit via dual-link DVI?

I don't know where you got 400 MHz from. AMD cards are normally limited to 297 MHz, which is just enough for 3840x2160 @ 30 Hz. That's assuming 4:4:4 output and 8-bit color. 10-bit color would require more bandwidth, so that would only be available at lower resolutions.

(12-23-2015 05:28 AM)zamar19 Wrote: In Radeon cards, if only DFP color depth options are shown but no GPU 10-bit option, then the driver won't send a 10-bit signal regardless of what signal you select for the monitor to accept. Does CRU allow activating the GPU option on Radeon, not only the DFP option? In FirePro I can set it now to 10-bit pixel output, but the display blacks out (not implemented in the current Win 10 driver & insufficient bandwidth?).

That's a FirePro feature. That option is not available with Radeon cards.

(12-23-2015 05:28 AM)zamar19 Wrote: Also, how do setting a Wide Gamut profile or ICC in Windows Color Management - Advanced - Device Profile and CRU add into this? This monitor's 10-bit panel supports a color gamut & saturation level of 72% of the NTSC CRT level, so it seems to qualify as standard rather than wide-gamut RGB. How much extra bandwidth does wide gamut require to pass a wider range of color saturation values from the GPU? Maybe none, if each pixel's saturation is passed as one value? I assume the wide gamut option would appear in the FirePro CC Pixel Format settings if supported by the monitor, or does one need to add it in a custom extension block? Should it be supported by the video card and driver too, and what would confirm such support? Would using YCbCr 4:4:4 limit the monitor's color gamut range compared to Full RGB 4:4:4?

Gamut has nothing to do with the pixel format or color depth. Gamut is a physical characteristic that describes how the display outputs color. It's not part of the pixel data. Color profiles are not installed automatically. Windows always assumes sRGB. You have to install a color profile or create one using a calibration device.

(12-23-2015 05:28 AM)zamar19 Wrote: Do you plan to add an ICC profile for this monitor? It's getting quite popular due to its low cost. Please consider adding more color gamut info to this ICC article, since most users wouldn't know the difference between color depth (pixel bitness, affecting the number of intensity shades per reproducible color) and color gamut (the min and max saturation intensity of each color, affecting the saturation level of each color shade), and how the gamut visible on a given monitor is affected not only by the ICC profile but by GPU color settings and available port bandwidth.

That's not my site.
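(To put rough numbers on the 297 MHz statement: the pixel clock is simply total horizontal pixels x total lines x refresh rate, and HDMI deep color scales the TMDS clock by 10/8 for 10 bpc. A back-of-the-envelope check, assuming the standard CTA-861 3840x2160 @ 30 Hz timing:)

```python
# Rough arithmetic behind the 297 MHz figure. The totals below are the
# standard CTA-861 timing for 3840x2160 @ 30 Hz (VIC 95); the 1.25x factor is
# how HDMI deep color scales the TMDS clock for 10 bpc output.

h_total, v_total, refresh = 4400, 2250, 30
pixel_clock = h_total * v_total * refresh / 1e6   # MHz

print(f"3840x2160 @ 30 Hz, 8 bpc: {pixel_clock:.2f} MHz")          # 297.00
print(f"same timing at 10 bpc:    {pixel_clock * 1.25:.2f} MHz")   # 371.25, over the limit
```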
12-24-2015, 12:01 PM
Post: #1888
RE: Custom Resolution Utility (CRU)
(12-24-2015 04:08 AM)ToastyX Wrote: HDMI displays can normally handle DVI signals, but there's no guarantee that DVI displays can handle HDMI signals.

So if I understand that correctly, my 2007FPs aren't capable of extracting the video signal out of a full HDMI signal with audio? I was under the impression that an HDMI-to-DVI cable already got rid of the audio part, since DVI cannot carry audio. This is obviously not the case then, and despite the DVI connector, the signal is still a complete HDMI signal including audio?! Thanks for clarifying that.

(12-24-2015 04:08 AM)ToastyX Wrote: Why not plug the receiver directly into the video card? If you need to use a monitor, use the Dell U3011 instead. It can handle 2560x1600 @ 60 Hz with HDMI, but you'll need to add a custom resolution.

There are two problems with that: to be able to enable the HDMI audio device, there must be an active display on that port which is also part of the Windows desktop, and it is very awkward and inconvenient to have some off-screen area on the Windows desktop that you can't see (the mouse is not visible while it's in that off-screen area; windows accidentally get moved or opened there; etc.). You are right that the DELL U3011 does in fact support HDMI (but only up to 1920x1080); it even supports audio over DisplayPort at the full resolution of 2560x1600, but I don't see a way to get that audio to the receiver. You suggest connecting the U3011 via HDMI (and the splitter) and adding a custom resolution of 2560x1600 @ 60 Hz to that? I'm not quite sure this works, because 2560x1600 is nearly double the number of pixels of 1920x1080 and thus double the bandwidth.

For now I've found a solution that kinda works: I connected the receiver separately to the GPU and simply mirrored one of the 2007FPs' desktop area to that output. This way the GPU scales the 1600x1200 (portrait) video signal to a 1920x1080 (landscape) signal, and the HDMI audio device becomes active in Windows. The only problem is that the stupid NVIDIA driver keeps the GPU at 3D clocks with that, and I need to use NVIDIA Inspector with its 'Multi Display Power Saver' feature to force idle clocks.
12-24-2015, 12:59 PM
Post: #1889
RE: Custom Resolution Utility (CRU)
(12-24-2015 12:01 PM)Breit Wrote: So if I understand that correctly, then my 2007FPs aren't capable of extracting the video signal out of a full HDMI signal with audio? I was under the impression that a HDMI-to-DVI cable already got rid of the audio part since DVI cannot support audio. This is obviously not the case then and despite the DVI connector, the signal is still a complete HDMI signal including audio?! Thanks for clarifying that.

HDMI is physically the same as single-link DVI. The signal determines whether it's HDMI or DVI. You can use the DVI ports as HDMI ports with most NVIDIA cards. The audio is embedded in the video signal during the blanking periods, so DVI monitors might not sync correctly with that data present.

(12-24-2015 12:01 PM)Breit Wrote: You suggest to connect the U3011 via HDMI (and splitter) and add a custom resolution of 2560x1600x60Hz to that? I'm not quite sure this works because 2560x1600 is nearly double the amount of pixels than 1920x1080 and thus double the bandwidth.

The U3011 and GTX 980 Ti can both handle 2560x1600 @ 60 Hz with HDMI. The only question is whether the splitter and receiver can handle higher pixel clocks. They should if they can handle UHD/4K resolutions because 3840x2160 @ 30 Hz uses more bandwidth.

(12-24-2015 12:01 PM)Breit Wrote: Only problem is, that the stupid NVIDIA driver keeps the GPU at 3D-clocks with that and I need to use NVIDIA Inspector with it's 'Multi Display Power Saver' feature to force idle clocks.

That wasn't happening before? It's normally not possible for the memory clock to change with multiple displays if the resolutions don't match exactly.
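(A quick sanity check of that bandwidth comparison, again as pixel clock = total horizontal pixels x total lines x refresh rate. The 2560x1600 totals below assume CVT reduced blanking, which is what CRU would typically generate for a custom resolution, so the exact figures for any given setup may differ slightly; the 4K totals are the standard CTA-861 timing:)

```python
# Back-of-the-envelope comparison: 2560x1600 @ 60 Hz with reduced blanking
# needs a lower pixel clock than the standard 3840x2160 @ 30 Hz timing.

def mhz(h_total, v_total, hz):
    return h_total * v_total * hz / 1e6

print(f"2560x1600 @ 60 Hz (CVT-RB, assumed totals): {mhz(2720, 1646, 60):.1f} MHz")  # ~268.7
print(f"3840x2160 @ 30 Hz (CTA-861 VIC 95):         {mhz(4400, 2250, 30):.1f} MHz")  # 297.0
```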
12-24-2015, 03:58 PM
(Last edited: 12-25-2015, 05:07 AM by zamar19)
Post: #1890
RE: Custom Resolution Utility (CRU)
@ToastyX
Quote: AMD cards are normally limited to 297 MHz, which is just enough for 3840x2160 @ 30 Hz. That's assuming 4:4:4 output and 8-bit color.

So why do 4:4:4 test images show 4:2:2 on this monitor at 4K resolution, despite 4:4:4 being included in its EDID? Could it be a limitation of the DP-to-HDMI 1.4 4K adapter chipset? Actually, most current Radeon and FirePro cards support one or more 4K @ 60 Hz outputs via DP 1.2; of course, older cards only support 4K @ 30 Hz via DP 1.0 or HDMI 1.4. To my understanding, Nvidia Quadro cards and drivers also have a 10-bit pixel output option, but other Nvidia models don't, despite offering several color depth options in Nvidia CC DFP properties.

I tried adding an HDMI profile to a custom extension block in CRU, but it only offers YCbCr 4:4:4, not Full RGB 4:4:4. For this monitor, the YCbCr 4:4:4 option is not available at 4K @ 30 Hz in FirePro CC for my GPU, only Full RGB 4:4:4 output, which is the default color space for the monitor. Why is there no Full RGB 4:4:4 option in the extension block?

I exported the EDID override .inf in CRU and installed it on Win 10 64-bit with Driver Signature Enforcement off, then rebooted. I can see in the registry: SEK0030/key_name/Device Parameters (EDID)/EDID_OVERRIDE (0,1). But it looks like Win 10 only reads the main EDID section and doesn't read the 0 and 1 overrides. If I replace the EDID key with the 0 and 1 keys, the OS recreates it on reboot. I was trying the override to get 4:4:4 chroma at 4K @ 30 Hz, since it's not listed in the main EDID, but the test picture still shows 4:2:2. So it looks like either Win 10, or the latest FirePro driver, or both can't handle EDID overrides. The driver just reads the main EDID from the EEPROM and saves it to the registry without the extension, despite it being part of the factory EDID. Why does this happen, and how can it be fixed?
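(For anyone wanting to verify what actually landed in the registry, here is a read-only sketch that walks the DISPLAY enum tree and reports, per monitor instance, whether Device Parameters holds an EDID value and an EDID_OVERRIDE subkey with numbered blocks. The layout follows the registry path quoted above; the assumption that overrides sit in an EDID_OVERRIDE subkey with values named 0, 1, etc. is taken from that description and may differ between driver versions. Windows only, inspect-only, nothing is modified.)

```python
# Sketch: list each monitor instance under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY
# and report whether an EDID value and an EDID_OVERRIDE subkey are present.
# The EDID_OVERRIDE layout (numbered block values) is an assumption based on the
# post above, not a documented contract.

import winreg

BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
    for monitor in subkeys(display):
        with winreg.OpenKey(display, monitor) as mon_key:
            for instance in subkeys(mon_key):
                try:
                    params = winreg.OpenKey(mon_key, instance + r"\Device Parameters")
                except OSError:
                    continue
                with params:
                    try:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                        edid_info = f"{len(edid)}-byte EDID"
                    except OSError:
                        edid_info = "no EDID"
                    try:
                        with winreg.OpenKey(params, "EDID_OVERRIDE") as ovr:
                            blocks = winreg.QueryInfoKey(ovr)[1]  # number of values
                            override_info = f"override with {blocks} block(s)"
                    except OSError:
                        override_info = "no override"
                print(f"{monitor}\\{instance}: {edid_info}, {override_info}")
```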