This is an archive of past discussions about Digital Visual Interface. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Is this pin used for transmitting EDID?
What could happen if this pin is not available?
Most transmitter systems (computers or video cards) will shut off
the output if the EDID data or the Hot Plug Detect is not there.
This is a function of the software that controls the transmitter.
According to DVI video pinout, C5 is for all analog ground. But in Wikipedia, Pin 15 is for "Return for pin 14 and analog sync"!! So, which pin is for analog sync ground, Pin 15 or C5?
"The long flat pin on a DVI-I connector is longer than the same pin on a DVI-D connector, so it is not possible to connect a male DVI-I to a female DVI-D by removing the 4 analog pins." Why is the C5 pin needed for DVI-D? Is it also for digital RGB ground?? — Preceding unsigned comment added by Totalz (talk • contribs)
Other limitations? I've read that it's limited to 1920x1200 at 60Hz. (True?)
What are its possible successors? (Are they working on some sort of DVI-2?) Some high-end displays today require 2 (or even 4) DVI cables; obviously in a few years when 3+ megapixel displays are common, nobody is going to want to have a lot of cables strewn across their desks to their displays.
About right. I just added a section on Specifications - see the article. As you can see, a system called "dual link" exists. This uses the reserved pins in the standard connector, and therefore fits in a single cable. I don't know if anyone makes graphics cards or monitors using this system. --Heron 17:40, 19 Sep 2004 (UTC)
I think there needs to be a bit more information there. 1920x1200 at 60Hz is only functional in interlaced mode with single-link, and I think the same is true of 1600x1200@60. I believe the best you can do for non-interlaced video is 1280x1024 (not sure about the refresh rate, though). —User:Mulad(talk) 16:19, May 2, 2005 (UTC)
Bullshit. The limit is 165 MHz per link. 1920x1200x60 is 138 MHz, which leaves a little headroom for blanking. There is no interlaced mode I'm aware of. --Dtcdthingy 03:35, 3 May 2005 (UTC)
Wait a minute, 1920x1200@60Hz should have a higher pixel clock than 1600x1200@60Hz. My math gets me 193 MHz with GTF. Looking closer, the listed 1920x1200@60Hz is not using GTF blanking while 1600x1200@60Hz is. It should be specified what blanking is being used. A 5% blanking would yield 146MHz. So I am guessing that we are using 10% blanking with this measure. Anyone have any idea? — Preceding unsigned comment added by 132.228.195.207 (talk)
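The blanking arithmetic in the posts above can be sketched as follows. This is an illustrative simplification (a flat blanking percentage rather than the real GTF/CVT timing formulas): the pixel clock is total pixels per frame, including blanking, times the refresh rate.

```python
# Sketch (not from the DVI spec): pixel clock = total pixels per frame
# times refresh rate, where "total" includes the blanking interval.
# A flat blanking fraction is a simplification; real GTF/CVT timings
# derive blanking from formulas, not a fixed percentage.

def pixel_clock_hz(h_active, v_active, refresh_hz, blanking_fraction):
    """Approximate pixel clock for a mode with a given blanking overhead."""
    active = h_active * v_active
    total = active * (1.0 + blanking_fraction)
    return total * refresh_hz

SINGLE_LINK_LIMIT_HZ = 165e6  # per-link TMDS pixel clock cap cited above

# 1920x1200@60 with no blanking at all: 138.24 MHz (the figure cited above)
print(pixel_clock_hz(1920, 1200, 60, 0.0) / 1e6)

# ~5% blanking: ~145 MHz, still under the 165 MHz single-link limit
print(pixel_clock_hz(1920, 1200, 60, 0.05) / 1e6)

# GTF-style blanking is far larger (~40% here, giving ~193 MHz), which is
# why 1920x1200@60 needs reduced blanking to fit on a single link
print(pixel_clock_hz(1920, 1200, 60, 0.40) / 1e6)
```

This reproduces both figures in the thread: ~138 MHz active-only, and ~193 MHz with full GTF blanking, which would exceed the single-link limit.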
dual link
The new Apple 30-inch display uses dual link. It runs at 2560x1600, which is higher than the highest resolution you wrote. It is made to work with the nVidia 6800 DDL Ultra (check on the Apple site).
I think that is "dual dual-link" (DDL), which could use two dual-link connectors to run the display, though I may be confused. The card might be so named in order to run two dual-link monitors. —User:Mulad(talk) 16:19, May 2, 2005 (UTC)
Now many ATI X1300 cards claim to support 2560x1600. Does that mean they are dual link and compatible with the Apple 30-inch Cinema Display? --Chochopk 18:40, 8 August 2006 (UTC)
dual link missing?
I see in the history that Dual link WAS mentioned in the article's Technical Discussion section, but then mysteriously disappeared. Is there a reason why? I have already added a section on Dual link to the Connector types section, but I'm quite puzzled as to why it disappeared from the article in the first place. CoolFox 04:08, Jun 6, 2005 (UTC)
Pardon? There's a great big paragraph on Dual Link in the Technical Discussion section, and I can't see any evidence in the page history that it was ever deleted since I originally wrote it. I've removed your new section accordingly. --Dtcdthingy 12:17, 6 Jun 2005 (UTC)
Hrm, looks like someone moved my paragraph to the Technical Discussion... phttt. Oh well, problem solved. CoolFox 14:08, Jun 6, 2005 (UTC)
Metrication
I have temporarily reverted edits by Dtcdthingy until he explains why using metric units before imperial is "idiotic". There is no clear policy on the use of S.I. units for display diagonals, and while the standard in some countries is the inch, this is not always the case. This is the English-language Wikipedia, after all. Thewikipedian 20:30 Jun 6, 2005 (UTC+2)
It's idiotic because the Apple display at least is called the "30-inch Cinema HD Display". There's no such thing as a "76cm Cinema HD Display". I love the metric system and all but its application in this case was totally mindless. --Dtcdthingy 20:07, 6 September 2005 (UTC)
I still prefer leaving "76 cm/30 inch cinema display" alone. It keeps the original name for the product while at the same time informing the reader of its equivalent in metric. See [this link] for instance. Perhaps wikipedia should open a new debate on this issue. Thewikipedian 23:19 September 15th, 2005 (UTC+2)
Outside the US, for street language and for marketing purposes, screen sizes are named in inches, in both the public and industrial domains. 20:21 UTC, 9 Feb 06. 64.58.166.120 20:22, 9 February 2006 (UTC)
If it's an ATI card, it might be one of these. The part number for the DVI Y-cable is 04E889, and the part number for the VGA Y-cable is 05E911. — A.M. 20:17, 20 June 2006 (UTC)
The DVI specification doesn't say anything about cable length. It depends entirely on the quality of the cable and the data rate of the signal, so there isn't a fixed number, though some examples wouldn't hurt. --Dtcdthingy 05:09, 1 November 2005 (UTC)
It is helpful to have a guide to cable length, because for DVI it is short. DVI was conceived for connections within one room, meaning up to 3 meters for normally-priced cables. However, some newer cables now coming onto the market are up to 10 meters in length (incorporating more expensive/lower-loss wire). Extender/converter boxes are also available for longer runs. 20:30 UTC, 9 Feb 06. 64.58.166.120 20:32, 9 February 2006 (UTC)
One of the references says "The official DVI specification mandates that all DVI equipment must maintain a signal at 5 meters (16 feet) in length". RichFarmbrough 17:59 20 June 2006 (GMT).
Previous standards were designed for CRT-based devices and thus did not use discrete time. Some of them were digital, like EGA and CGA, or binary, like MDA, or analog, like VGA.
What's the difference between digital and binary? AFAIK, today's digital electronics uses two logic levels, HIGH and LOW, thus making it the same as binary. Multivalued logics are simulated using the two native logic levels, which have hardware support. So, the article (2nd line, Overview section) is misleading, I feel. --06:39, 23 February 2006 (UTC)
I think "binary" refers to early computer colour displays where R, G and B were either "on" or "off"; giving just eight possible colours: Black, Red, Green, Blue, Cyan (Green+Blue), Yellow (Red + Green), Magenta (Red + Blue) and White. You are correct to observe that this is just a limiting case of a digital display, however! -- Kim SJ12:47, 13 September 2006 (UTC)
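The eight-color limit described above falls straight out of the combinatorics. A minimal sketch (names matching the list in the post above):

```python
from itertools import product

# Each of R, G, B is simply on (1) or off (0), giving 2**3 = 8 colors.
names = {
    (0, 0, 0): "Black",   (1, 0, 0): "Red",
    (0, 1, 0): "Green",   (0, 0, 1): "Blue",
    (0, 1, 1): "Cyan",    (1, 1, 0): "Yellow",
    (1, 0, 1): "Magenta", (1, 1, 1): "White",
}
palette = [names[rgb] for rgb in product((0, 1), repeat=3)]
print(len(palette))  # 8
```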
Mini-DVI Pictures
I've just reverted 65.70.89.241's addition of the Mini-DVI connector and diagram. I'm of the opinion that adding those pictures clutters the article a bit much and, for the real DVI section, is sorta off-topic. But that's just my preference, anybody else feel like opining? — Mobius 21:43, 5 August 2006 (UTC)
"No compression is used and DVI has no provision for only transmitting changed parts of the image. This means the whole frame is constantly re-transmitted." -- This is not entirely correct, I believe. At least it could be hinted that in the specs (http://www.ddwg.org/lib/dvi_10.pdf), there's 1.2.2, "Conversion to Selective Refresh", a means to only transfer frame delta. There's obviously nothing resembling a definition yet, but at least it is mentioned and might possibly get extended in future revisions. —The preceding unsigned comment was added by 82.207.195.220 (talk • contribs).
Please feel free to be bold and add this information (and your citation) to the article!
Ok, I did. I also changed the first external link that referred to the Homepage and added the specs as a second link. Please check .-) —The preceding unsigned comment was added by 82.207.195.220 (talk) 20:25, 27 December 2006 (UTC).
Article Picture Label
Should not the main picture for the article be labeled as DVI-I (Dual Link) not as DVI-D (Dual Link)?
Just mislabeled?
Kuba425 19:41, 17 February 2007 (UTC)
DVI to legacy VGA converter
It would be useful to include some information about this. Not least because I would be interested to know how they work (I have an LCD monitor but it only has a VGA connector!). It would also be useful for people considering a new graphics card but who do not have a DVI-compatible monitor. I can provide a photo of an adaptor if it becomes relevant to supply one. Crumbly Biscuits 17:21, 23 February 2007 (UTC)
This is covered in the connector section (which I've reworded slightly). The graphics card outputs a full set of VGA signals through extra pins on the DVI connector. A "converter" simply rearranges the pins so you can physically connect a VGA cable. --Dtcdthingy 13:40, 25 February 2007 (UTC)
It would also be nice to know about the DVI->S-Video/Composite adapter that Apple sells. (I'm sure others sell similar adapters as well...) Apple sells them for $20, but you can get a DVI->VGA and a VGA->S-Video/Composite adapter for much less. Is it the same thing? How are the pins connected? 131.215.44.237 23:02, 3 July 2007 (UTC)
DVI = Digital Video Interface or Digital Visual Interface?
I just read some websites about DVI; they all say DVI = Digital Video Interface, and here on Wikipedia they call it Digital Visual Interface. Nice name, but that isn't right, is it? — Preceding unsigned comment added by 83.117.247.49 (talk)
The specification section should make clear that the clock rates specified are pixel clocks, not bit clocks. This can be confusing to neophytes, especially since there is no discussion of the embedded DVI requirement to have clock recovery on each pixel line (within each pixel clock). It might also be helpful to add a brief discussion of the clock recovery to decode the bits within each pixel (which is highly implementation specific, of course).
66.82.9.53 01:04, 30 May 2007 (UTC)
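The pixel-clock/bit-clock distinction raised above can be made concrete. In DVI's TMDS scheme, each of the three data channels transmits one 10-bit symbol per pixel clock (8 data bits encoded into 10 bits on the wire), so the serial bit rate per channel is ten times the pixel clock:

```python
# Sketch of the pixel-clock vs bit-clock relationship described above:
# DVI's TMDS encoding sends one 10-bit symbol per pixel clock on each of
# the three data channels, so the serial bit rate is 10x the pixel clock.

TMDS_BITS_PER_SYMBOL = 10  # 8 data bits encoded into 10 transmitted bits

def bit_rate_per_channel(pixel_clock_hz):
    return pixel_clock_hz * TMDS_BITS_PER_SYMBOL

# At the 165 MHz single-link maximum, each channel runs at 1.65 Gbit/s
print(bit_rate_per_channel(165e6) / 1e9)  # 1.65
```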
Unclear how DVI signal is helped by using HDMI cables
In the HDMI page it mentions that HDMI cables can carry a DVI signal, via an adapter. Does using the HDMI cable overcome the DVI distance limitation, or are boosters still needed? Does using an HDMI cable have any other effects on a DVI signal? I am curious, but would also find this useful for the articles in question. --Alphastream 03:33, 3 November 2007 (UTC)
What the hell is this paragraph doing here?
NOTE - Do not expect a DVI-I to VGA converter to allow you to use a standard VGA monitor. I have a VGA monitor from Hyundai (Q995) and it doesn't work. I emailed them and they say it is standard for VGA monitors (not just theirs) not to function this way. If you are thinking of connecting a monitor this way it is worth checking that it will work or you may find yourself in the same situation as I am - a monitor that I cannot use and that I cannot get a refund for!
OK, it's a pity but that's not encyclopedic in any way!!! Please be serious about editing!!! This is an encyclopaedia not a computer forum!
Excuse me, but I believe the term "discussion" is roughly equivalent to "forum". Derp de der! —The preceding unsigned comment was added by 71.103.73.154 (talk • contribs) .
Wikipedia is not a place to publish your own thoughts and analyses. Please do not use Wikipedia for any of the following:
6. Discussion forums. Please try to stay on the task of creating an encyclopedia. You can chat with folks on their user talk pages, and should resolve problems with articles on the relevant talk pages, but please do not take discussion into articles. There are a number of early-stage projects that attempt to use a wiki for discussion and debate.
A simple adapter should work with most VGA monitors, as long as you have DVI-I (integrated, digital & analog) or DVI-A (analog only). But if you have DVI-D (digital only), the analog signals the VGA needs will not be present. (Of course, there are many reasons for the VGA monitor not to work, such as setting the display resolution/frequencies wrong.)
-69.87.203.131 (talk) 12:07, 15 July 2008 (UTC)
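The logic in the reply above is simple enough to sketch. This is an illustrative summary (the variant names are from this discussion, not a formal API): a passive adapter only rearranges pins, so it works only if the source connector actually carries the analog signals VGA needs.

```python
# Sketch (illustrative, not from the spec): which DVI variants carry the
# analog RGB/sync signals, and hence whether a passive DVI->VGA adapter
# can possibly work with them.

ANALOG_PRESENT = {
    "DVI-I": True,   # integrated: digital + analog on the same connector
    "DVI-A": True,   # analog only
    "DVI-D": False,  # digital only: no analog RGB or sync to pass through
}

def passive_vga_adapter_works(dvi_variant):
    """A passive adapter only rearranges pins; it cannot create signals."""
    return ANALOG_PRESENT[dvi_variant]

print(passive_vga_adapter_works("DVI-I"))  # True
print(passive_vga_adapter_works("DVI-D"))  # False
```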
VGA Comparison
The "Overview" section explains some of the reasons why it is better than the VGA connector; however, it's way too technical. A simpler explanation should be added. —Preceding unsigned comment added by T0rek (talk • contribs) 10:18, August 28, 2007 (UTC)
Pin C5, Always horizontal? or sometimes a cross / 'plus sign' shape
It seems, in the comparison of DVI types, that C5 is always horizontal, but I seem to have it in my head that I have seen sockets to accept a plus shape type C5. The Pinout/description also shows C5 keyed as a +. Is anybody able to clarify this, and would it warrant mention in the article? Thanks -- PidGin128 from 65.190.216.161 (talk) 06:48, 17 December 2007 (UTC) .
It would be helpful to have photos and discussion of DVI to 3-RCA adapters. Do they always just carry RGB? What about sync? What about Y/Pb/Pr weirdness, whatever that is? What exactly do they do? When are they useful? What are the problems with using them? They seem to be commonly sold. Do they work with most TVs?
-69.87.203.131 (talk) 11:56, 15 July 2008 (UTC)
recessed pin
Pin 14 / +5 V / Power for monitor when in standby
The DVI-VGA adapters I just got have one pin (Pin 14) somewhat recessed. Is this normal? Why? The article needs more, better pictures of actual male DVI connectors.
-69.87.199.83 (talk) 01:24, 22 July 2008 (UTC)
sound
I have a video card with an onboard sound chip and DVI output (e.g. most AMD HD 2400 cards). If connected to a TV with the DVI-HDMI adapter that came with the card, I get working video as well as audio on the attached TV. Therefore we can infer there is also sound going through DVI in this case. Can someone provide more information (i.e. which pins, type of signal, conforming to which standards)? It could also be that this is a custom implementation (e.g. from Sapphire) and some unused pins are just used to get the audio through. —Preceding unsigned comment added by Ecov (talk • contribs) 22:30, 23 July 2008 (UTC)
DVI/HDMI signals are compatible, and the same pins carry video as sound over the TMDS channel for HDMI. From DVI to HDMI is just pin remapping. I'm currently in a lengthy argument with HP tech support over getting my GeForce 9500 to output sound through the HDMI jack by default instead of the DVI port. Needless to say, they're not being very helpful. 71.191.199.243 (talk) 22:30, 11 August 2008 (UTC)
Male connector varieties (DVI_Connector_Types.svg)
I was just bitten by a difference between DVI-D female connectors (on every monitor type I can find) and DVI-I cables, beyond those visible in DVI_Connector_Types.svg. A DVI-I cable will not fit in a DVI-D socket, even after removing the analog R, G, B, and horizsync pins, because the DVI-I "ground" bar is significantly wider than the DVI-D "ground" bar or socket.
Just wanted to let the authors of the article know
This article was very informative. I think it's at least an A-quality article. It describes so many aspects of this connection type that it deserves to be reviewed for its quality at least. I learned so many things I never knew about the connection format before. JasonHockeyGuy (talk) 05:58, 3 February 2009 (UTC)
Superseded By
I think it is inaccurate to say that this standard has been superseded by Displayport. It is the only connector that is on every graphics card sold, and as far as I know no companies have any intention of dropping support.
Bholstege (talk) 02:41, 13 December 2008 (UTC)
Image caption
Mousing over the image tells you it's a female DVI connector. The caption claims it's male.
So what is it really?
--Tech Nerd (talk) 07:06, 22 June 2008 (UTC)
8b/10b encoding was created long ago to ensure enough transitions on the link for purposes such as clock recovery, while keeping DC balance in mind.
TMDS minimizes transitions for a different approach to signal integrity. —Preceding unsigned comment added by Longmontrandall (talk • contribs) 23:32, 4 March 2009 (UTC)
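The transition-minimizing behavior mentioned above can be sketched. This is a simplified illustration of only the first stage of TMDS 8b/10b encoding (the XOR/XNOR choice); the second, DC-balancing stage of real TMDS is omitted:

```python
def tmds_stage1(byte):
    """First (transition-minimizing) stage of TMDS 8b/10b encoding.

    Returns 9 bits, LSB-first: 8 encoded bits plus a flag bit that is 1
    when XOR was used, 0 for XNOR. The second (DC-balancing) stage of
    real TMDS is omitted here for brevity.
    """
    d = [(byte >> i) & 1 for i in range(8)]  # data bits, LSB-first
    ones = sum(d)
    # XNOR is chosen when the byte has many ones, since XNOR chaining
    # suppresses the transitions that those ones would otherwise cause
    use_xor = ones < 4 or (ones == 4 and d[0] == 1)
    q = [d[0]]
    for i in range(1, 8):
        if use_xor:
            q.append(q[-1] ^ d[i])
        else:
            q.append(1 - (q[-1] ^ d[i]))  # XNOR
    q.append(1 if use_xor else 0)
    return q

def transitions(bits):
    return sum(a != b for a, b in zip(bits, bits[1:]))

# A worst-case alternating byte: 7 transitions raw, fewer after encoding
raw = [(0b10101010 >> i) & 1 for i in range(8)]
enc = tmds_stage1(0b10101010)
print(transitions(raw), transitions(enc[:8]))  # 7 3
```

Contrast with 8b/10b, which deliberately *inserts* transitions to guarantee clock recovery; TMDS instead minimizes them to reduce EMI and crosstalk over the cheap, long cables DVI targets.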
audio over DVI - how? pinout ?
AMD/ATI, nVidia, Intel - all of them support sending audio via a DVI to HDMI adapter. That means the audio channel is contained within the DVI-I connector.
But how? What is the pinout? How does it not break compatibility - there seem to be no unused pins in the DVI-I connector! —Preceding unsigned comment added by 91.78.12.22 (talk) 19:23, 29 June 2009 (UTC)
DVI does not support Audio. Some DVI to HDMI dongles that are supplied with video cards contain electronics within that dongle that enable audio via HDMI. If you connect a normal monitor via DVI, or use a generic adapter, you will not get audio via the same cable. —Preceding unsigned comment added by 190.95.27.69 (talk) 13:28, 15 July 2009 (UTC)
Yes it does. It is not a matter of pins as it is a matter of signalling. As I understand it, the TMDS signal is just the same as with HDMI. A standard DVI or DVI to HDMI cable contains all the wiring that is necessary for the audio/video TMDS stream. Thus, audio can be supported over DVI as long as the PC or other source device adds audio to the signal, something which these new HDMI video cards are capable of. Note that to get audio you probably need to connect to a HDMI input on your display, probably using a DVI to HDMI adapter or a DVI to HDMI cable, because DVI connections on a display typically do not support audio.
I asked this question before, but it has been moved to the archive, see Talk:Digital_Visual_Interface/Archive_1#sound. According to an answer there, apparently DVI is fully pin compatible with HDMI. To see which DVI variants are pin compatible, or according to which standards, see the HDMI article, under HDMI#Compatibility_with_DVI. Looking at the pinouts, it looks like DVI misses the CEC channel.--Ecov (talk) 12:58, 16 July 2009 (UTC)
Recessed pin?
The archived discussion has an entry ("recessed pin") dated 22 July 2008 that was never addressed, and is the sort of thing which should be covered in the main article. Why is one pin shorter? My guess is so that a turned-off monitor gets a connection to ground before power is connected, thus avoiding the situation where only power is connected and it sees all the 0 V signals as being negative, which could cause electrical damage. Anyone have a reference for this?