This is an archive of past discussions about Digital Visual Interface. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Is this pin used for transmitting EDID?
What could happen if this pin is not available?
Most transmitter systems (computers or video cards) will shut off the output if the EDID data or the Hot Plug Detect signal is not present. This is a function of the software that controls the transmitter.
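As a concrete illustration of this EDID dependence, here is a minimal sketch of checking whether a display's EDID is readable on a Linux system; the connector name in the path is a hypothetical example and varies per machine:

```python
# Minimal sketch: check whether the kernel exposes EDID bytes for a display.
# The connector name ("card0-DVI-D-1") is a hypothetical example; actual
# names differ per system. An empty file typically means no EDID was read,
# which is the situation where many sources shut off their output.
from pathlib import Path

edid_path = Path("/sys/class/drm/card0-DVI-D-1/edid")
data = edid_path.read_bytes() if edid_path.exists() else b""

if data:
    print(f"EDID present: {len(data)} bytes, header {data[:8].hex()}")
else:
    print("No EDID read; many transmitters will disable output here.")
```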
According to the DVI video pinout, C5 is the common analog ground. But in Wikipedia, pin 15 is listed as "Return for pin 14 and analog sync"! So which pin is the analog sync ground, pin 15 or C5?
"The long flat pin on a DVI-I connector is longer than the same pin on a DVI-D connector, so it is not possible to connect a male DVI-I to a female DVI-D by removing the 4 analog pins." Why is the C5 pin needed for DVI-D? Is it also a ground for the digital RGB signals? — Preceding unsigned comment added by Totalz (talk • contribs)
Other limitations? I've read that it's limited to 1920x1200 at 60Hz. (True?)
What are its possible successors? (Are they working on some sort of DVI-2?) Some high-end displays today require 2 (or even 4) DVI cables; obviously in a few years when 3+ megapixel displays are common, nobody is going to want to have a lot of cables strewn across their desks to their displays.
About right. I just added a section on Specifications - see the article. As you can see, a system called "dual link" exists. This uses the reserved pins in the standard connector, and therefore fits in a single cable. I don't know if anyone makes graphics cards or monitors using this system. --Heron 17:40, 19 Sep 2004 (UTC)
I think there needs to be a bit more information there. 1920x1200 at 60Hz is only functional in interlaced mode with single-link, and I think the same is true of 1600x1200@60. I believe the best you can do for non-interlaced video is 1280x1024 (not sure about the refresh rate, though). —User:Mulad (talk) 16:19, May 2, 2005 (UTC)
Bullshit. The limit is 165 MHz per link. 1920x1200x60 is 138 MHz of active pixels, which leaves a little headroom for blanking. There is no interlaced mode I'm aware of. --Dtcdthingy 03:35, 3 May 2005 (UTC)
Wait a minute, 1920x1200@60Hz should have a higher pixel clock than 1600x1200@60Hz. My math gets me 193 MHz with GTF. Looking closer, the listed 1920x1200@60Hz is not using GTF blanking while 1600x1200@60Hz is. It should be specified what blanking is being used. A 5% blanking would yield 146 MHz, so I am guessing that 10% blanking is assumed in this figure. Anyone have any idea? — Preceding unsigned comment added by 132.228.195.207 (talk • contribs)
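For anyone checking the arithmetic in this thread, here is a minimal sketch of the estimate being debated; the blanking-overhead fractions are illustrative assumptions, not figures from the DVI spec or GTF:

```python
# Rough pixel-clock estimate: active pixels per second scaled by an assumed
# blanking overhead. Real timings come from GTF/CVT formulas; the overhead
# fractions here are only for illustrating the numbers in this thread.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead):
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

print(pixel_clock_mhz(1920, 1200, 60, 0.00))  # ~138.2, the bare figure above
print(pixel_clock_mhz(1920, 1200, 60, 0.05))  # ~145.2, roughly the "5%" case
print(pixel_clock_mhz(1920, 1200, 60, 0.10))  # ~152.1, the "10%" guess
```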
dual link
The new Apple 30-inch display uses dual link. It runs at 2560x1600, which is higher than the highest resolution you wrote. It is made to work with the Nvidia 6800 DDL Ultra (check the Apple site).
I think that is "dual dual-link" (DDL), which could use two dual-link connectors to run the display, though I may be confused. The card might be so named in order to run two dual-link monitors. —User:Mulad (talk) 16:19, May 2, 2005 (UTC)
Many ATI X1300 cards now claim to support 2560x1600. Does that mean they are dual link and compatible with the Apple 30-inch Cinema Display? --Chochopk 18:40, 8 August 2006 (UTC)
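As a back-of-the-envelope check of why 2560x1600 needs dual link (the ~8% blanking overhead below is an illustrative assumption, not a spec figure):

```python
# Why 2560x1600@60 exceeds one DVI link: single link tops out at a
# 165 MHz pixel clock. The 8% blanking overhead is an assumed figure.
SINGLE_LINK_MAX_MHZ = 165

def required_clock_mhz(width, height, refresh_hz, blanking_overhead=0.08):
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

clock = required_clock_mhz(2560, 1600, 60)
print(f"{clock:.0f} MHz needed; dual link required: {clock > SINGLE_LINK_MAX_MHZ}")
# ~265 MHz needed; dual link required: True
```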
dual link missing?
I see in the history that dual link WAS mentioned in the article's Technical Discussion section, but then mysteriously disappeared. Is there a reason why? I have already added a section on dual link to the Connector types section, but I'm quite puzzled as to why it disappeared from the article in the first place. CoolFox 04:08, Jun 6, 2005 (UTC)
Pardon? There's a great big paragraph on Dual Link in the Technical Discussion section, and I can't see any evidence in the page history that it was ever deleted since I originally wrote it. I've removed your new section accordingly. --Dtcdthingy 12:17, 6 Jun 2005 (UTC)
Hrm, looks like someone moved my paragraph to the Technical Discussion... phttt. Oh well, problem solved. CoolFox 14:08, Jun 6, 2005 (UTC)
Metrication
I have temporarily reverted edits by Dtcdthingy until he explains why using metric units before imperial is "idiotic". There is no clear policy on the use of SI units for display diagonals, and while the standard in some countries is the inch, this is not always the case. This is the English-language Wikipedia, after all. Thewikipedian 20:30 Jun 6, 2005 (UTC+2)
It's idiotic because the Apple display at least is called the "30-inch Cinema HD Display". There's no such thing as a "76cm Cinema HD Display". I love the metric system and all but its application in this case was totally mindless. --Dtcdthingy 20:07, 6 September 2005 (UTC)
I still prefer leaving "76 cm/30 inch cinema display" alone. It keeps the original name for the product while informing the reader of its metric equivalent. See [this link] for instance. Perhaps Wikipedia should open a new debate on this issue. Thewikipedian 23:19 September 15th, 2005 (UTC+2)
Outside the US, for street language and for marketing purposes, screen sizes are named in inches, in both the public and industrial domains. 64.58.166.120 20:22, 9 February 2006 (UTC)
If it's an ATI card, it might be one of these. The part number for the DVI Y-cable is 04E889, and the part number for the VGA Y-cable is 05E911. — A.M. 20:17, 20 June 2006 (UTC)
The DVI specification doesn't say anything about cable length. It depends entirely on the quality of the cable and the data rate of the signal, so there isn't a fixed number, though some examples wouldn't hurt. --Dtcdthingy 05:09, 1 November 2005 (UTC)
It is helpful to have a guide to cable length, because for DVI it is short. DVI was conceived for connections within one room, meaning up to 3 meters for normally-priced cables. However, some newer cables now coming onto the market reach up to 10 meters in length (incorporating more expensive, lower-loss wire). Extender/converter boxes are also available for longer runs. 64.58.166.120 20:32, 9 February 2006 (UTC)
One of the references says "The official DVI specification mandates that all DVI equipment must maintain a signal at 5 meters (16 feet) in length". RichFarmbrough 17:59 20 June 2006 (GMT).
Previous standards were designed for CRT-based devices and thus did not use discrete time. Some of them were digital, like EGA and CGA, or binary, like MDA, or analog, like VGA.
What's the difference between digital and binary? AFAIK, today's digital electronics uses two logic levels, HIGH and LOW, thus making it the same as binary. Multivalued logics are simulated using the two native logic levels which have hardware support. So, the article (2nd line, Overview section) is misleading, I feel. --06:39, 23 February 2006 (UTC)
I think "binary" refers to early computer colour displays where R, G and B were either "on" or "off"; giving just eight possible colours: Black, Red, Green, Blue, Cyan (Green+Blue), Yellow (Red + Green), Magenta (Red + Blue) and White. You are correct to observe that this is just a limiting case of a digital display, however! -- Kim SJ12:47, 13 September 2006 (UTC)
Mini-DVI Pictures
I've just reverted 65.70.89.241's addition of the Mini-DVI connector picture and diagram. I'm of the opinion that adding those pictures clutters the article a bit much and, for the real DVI section, is sorta off-topic. But that's just my preference; anybody else feel like opining? — Mobius 21:43, 5 August 2006 (UTC)
Neither red link is explained. Can someone please explain what LCD Blanking is?
— Preceding unsigned comment added by 74.134.146.52 (talk • contribs)
"No compression is used and DVI has no provision for only transmitting changed parts of the image. This means the whole frame is constantly re-transmitted." -- This is not entirely correct, I believe. At least it could be hinted that in the specs (http://www.ddwg.org/lib/dvi_10.pdf), there's 1.2.2, "Conversion to Selective Refresh", a means to only transfer frame delta. There's obviously nothing resembling a definition yet, but at least it is mentioned and might possibly get extended in future revisions. —The preceding unsigned comment was added by 82.207.195.220 (talk • contribs).
Please feel free to be bold and add this information (and your citation) to the article!
Ok, I did. I also changed the first external link, which referred to the homepage, and added the specs as a second link. Please check .-) —The preceding unsigned comment was added by 82.207.195.220 (talk) 20:25, 27 December 2006 (UTC).
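To make the selective-refresh idea above concrete, here is a minimal sketch of frame-delta detection; this is a generic dirty-row approach for illustration only, since DVI 1.0 merely reserves "Conversion to Selective Refresh" without defining any mechanism:

```python
# Generic frame-delta sketch: find which rows changed between two frames,
# i.e. what a selective-refresh scheme could retransmit instead of the
# whole frame. Purely illustrative; not a mechanism from the DVI spec.
def changed_rows(prev_frame, next_frame):
    """Frames are equal-length lists of rows (lists of pixel values)."""
    return [i for i, (a, b) in enumerate(zip(prev_frame, next_frame)) if a != b]

prev = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]
curr = [[0, 0, 0], [1, 9, 1], [2, 2, 2]]
print(changed_rows(prev, curr))  # [1] -> only row 1 would need resending
```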
Article Picture Label
Should not the main picture for the article be labeled as DVI-I (Dual Link) not as DVI-D (Dual Link)?
Just mislabeled?
Kuba425 19:41, 17 February 2007 (UTC)
DVI to legacy VGA converter
It would be useful to include some information about this, not least because I would be interested to know how they work (I have an LCD monitor, but it only has a VGA connector!). It would also be useful for people who are considering a new graphics card but do not have a DVI-compatible monitor. I can provide a photo of an adaptor if it becomes relevant to supply one. Crumbly Biscuits 17:21, 23 February 2007 (UTC)
This is covered in the connector section (which I've reworded slightly). The graphics card outputs a full set of VGA signals through extra pins on the DVI connector. A "converter" simply rearranges the pins so you can physically connect a VGA cable. --Dtcdthingy 13:40, 25 February 2007 (UTC)
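For the curious, a passive DVI-to-VGA adapter really is just a remapping of the analog pins; the sketch below lists the commonly documented correspondence (worth double-checking against the DVI spec's pinout table before relying on it):

```python
# Passive DVI-I/DVI-A -> VGA adapter: a straight pin remapping, no signal
# conversion. This mapping reflects the commonly documented analog pin
# assignment; verify against the DVI spec before treating it as definitive.
DVI_TO_VGA = {
    "C1": "Red video",
    "C2": "Green video",
    "C3": "Blue video",
    "C4": "Horizontal sync",
    "C5": "Analog ground (return for red, green, blue)",
    "8":  "Vertical sync",
    "6":  "DDC clock (SCL)",
    "7":  "DDC data (SDA)",
}

for dvi_pin, vga_signal in DVI_TO_VGA.items():
    print(f"DVI pin {dvi_pin:>2} -> VGA {vga_signal}")
```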
It would also be nice to know about the DVI->S-Video/Composite adapter that Apple sells. (I'm sure others sell similar adapters as well...) Apple sells them for $20, but you can get a DVI->VGA and a VGA->S-Video/Composite adapter for much less. Is it the same thing? How are the pins connected? 131.215.44.237 23:02, 3 July 2007 (UTC)
DVI = Digital Video Interface or Digital Visual Interface?
I just read some websites about DVI; they all say DVI = Digital Video Interface, but here on Wikipedia it's called Digital Visual Interface. Nice name, but that isn't right, is it? — Preceding unsigned comment added by 83.117.247.49 (talk • contribs)
The specification section should make clear that the clock rates specified are pixel clocks, not bit clocks. This can be confusing to neophytes, especially since there is no discussion of the embedded DVI requirement to have clock recovery on each pixel line (within each pixel clock). It might also be helpful to add a brief discussion of the clock recovery used to decode the bits within each pixel (which is highly implementation-specific, of course).
66.82.9.53 01:04, 30 May 2007 (UTC)
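To make the pixel-clock/bit-clock distinction concrete: TMDS encodes each 8-bit colour component into a 10-bit symbol, so each data channel's serial rate is ten times the pixel clock. A quick worked example using the standard single-link figures:

```python
# Pixel clock vs. bit clock on a DVI single link (standard TMDS figures).
PIXEL_CLOCK_MHZ = 165   # single-link maximum pixel clock
BITS_PER_SYMBOL = 10    # TMDS encodes 8 data bits as a 10-bit symbol
DATA_CHANNELS = 3       # one channel each for red, green and blue

per_channel_mbps = PIXEL_CLOCK_MHZ * BITS_PER_SYMBOL
aggregate_mbps = per_channel_mbps * DATA_CHANNELS

print(f"Per-channel bit rate: {per_channel_mbps} Mbit/s")  # 1650
print(f"Aggregate link rate:  {aggregate_mbps} Mbit/s")    # 4950
```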
Unclear how DVI signal is helped by using HDMI cables
The HDMI page mentions that HDMI cables can carry a DVI signal via an adapter. Does using an HDMI cable overcome the DVI distance limitation, or are boosters still needed? Does using an HDMI cable have any other effects on a DVI signal? I am curious, but would also find this useful for the articles in question. --Alphastream 03:33, 3 November 2007 (UTC)