Talk:Edge detection
WikiProject Robotics (C-class, Mid-importance)
WikiProject Computing (B-class)
WikiProject Computer science (B-class, Low-importance)
In the practice of digital image enhancement, basing edge detection merely on numerical derivatives is too naive and unrealistic. For each pixel of a digital image, one wants not only to decide whether it is a candidate for membership in an "edge" but also to find the direction of that edge. [In particular, the edge direction is required for true sharpening.] One needs to analyze a suitable collection of neighboring pixels (typically, those at horizontal and vertical distances up to 3) with respect to intensity as well as position. Although effective methods of doing this are not very difficult to develop, it seems that commercial software does not provide truly suitable implementations.
Response to unsigned criticism above: Well, edge detection based on image derivatives is not as naive as suggested, given the well-known practice of using Gaussian filtering as a pre-processing stage to the computation of image derivatives. This means that the effective support region for the image derivative computations is equal to the support region of the first-order Gaussian derivative operators, and is thus substantially larger than a distance of three pixels. Moreover, within the differential approach to edge detection the orientation of an edge is given as orthogonal to the orientation of the image gradient, as estimated by first-order Gaussian derivative operators. In practice, these approaches have found numerous successful applications in computer vision, although usually with goals other than mere image enhancement. Tpl
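For readers who want to try the kind of differential approach described above, here is a minimal MATLAB sketch. The test image, standard deviation and threshold are illustration choices only, and fspecial/imfilter assume the Image Processing Toolbox:
% Rough illustration: Gaussian pre-smoothing followed by gradient computation.
a = double(imread('rice.png'));                 % any greyscale test image
sigma = 2;                                      % illustrative scale parameter
G = fspecial('gaussian', 2*ceil(3*sigma)+1, sigma);
as = imfilter(a, G, 'replicate');               % Gaussian smoothing
[gx, gy] = gradient(as);                        % first-order derivatives of the smoothed image
mag = sqrt(gx.^2 + gy.^2);                      % gradient magnitude
edgedir = atan2(gy, gx) + pi/2;                 % edge orientation: orthogonal to the gradient
edges = mag > 0.1*max(mag(:));                  % crude threshold, for display only
figure, imshow(edges);
Smoothing followed by differencing is equivalent to filtering with first-order Gaussian derivative kernels, which is why the effective support is set by sigma rather than by a fixed three-pixel neighbourhood.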
Add some "why"?
Lots of technical "how" but not a lot of explanatory "why" for us non-techies. Just a little would be nice. Awotter 23:17, 29 October 2007 (UTC)
Major restructuring of this article
Following the tag added in October 2007, I have now made a first attempt at restructuring this article to bring it up to date with respect to the topic of edge detection and also to give more technical details of basic edge detectors. Question to those of you who have tagged this article: do you find it appropriate to remove the tag? Tpl (talk) 16:40, 22 February 2008 (UTC)
Atomic line filter
Why is there a link to "atomic line filter" in the "See Also" section? I skimmed through the article, and I don't see that it really has anything at all to do with edge or line detection. Anyone? 65.183.135.231 (talk) 18:03, 18 July 2008 (UTC)
I think that link should be removed since it deals with spectroscopy and not image processing. Line detection is closely related to edge detection and so should be included in this article. I'll try to add something soon. Zen-in (talk) 21:29, 4 June 2009 (UTC)
I agree that the link to atomic line filter is not suitable. Regarding the topic of "line detection", there is an article on ridge detection. For detecting straight lines or circles, there is also an article on the Hough transform. Tpl (talk) 06:29, 5 June 2009 (UTC)
Trivial
The "Why edge detection is a non-trivial task" section is a total failure. It shows a trivial example and then says "but most of it is not this trivial". Come on! Give a non-trivial example! Obviously!! Don't lead off with an easy one and then just wave your hands around saying that there are others that are harder. That's not explanation! --98.217.8.46 (talk) 17:25, 5 September 2008 (UTC)
- I agree that the text can be misunderstood if not read in a positive spirit. Now, I have added an additional sentence to explain where the problem is. Probably a better illustration would help, but the current illustration is used for historical reasons (because it is available). If you know how to set the grey levels in the presumably hexadecimal (?) notation used by previous authors, please help to generate a better illustration. Tpl (talk) 14:54, 6 September 2008 (UTC)
Steps to detect an edge map
1. Read the original image (converted to double so that filter2 can operate on it):
a = double(imread('rice.png'));
2. Define the x and y gradient operators (such as Prewitt or Sobel):
xo = [-1 0 1; -1 0 1; -1 0 1]; yo = [-1 -1 -1; 0 0 0; 1 1 1];
3. Smooth the image and get the x and y components of the gradient by correlating the original image with each operator:
gx = filter2(xo, a); gy = filter2(yo, a);
4. Compute the gradient magnitude:
g = sqrt(gx.*gx + gy.*gy);
5. Choose a threshold value t and convert the grey-level edge map to binary (for example t = 0.3):
t = 0.3; gb = im2bw(g/255, t);
6. Show the edge map:
figure, imshow(gb);
With my wishes, Dr. Ziad Alqadi, Al-Balqa University, Faculty of Engineering Technology, Amman, Jordan
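As a side note for readers trying the steps above: if the Image Processing Toolbox is available, its built-in edge function gives comparable results with less code. The calls below are only a sketch, with the methods chosen as examples:
a = imread('rice.png');                          % uint8 input is accepted by edge
bw1 = edge(a, 'prewitt');                        % Prewitt gradient with an automatically chosen threshold
bw2 = edge(a, 'canny');                          % Canny: smoothing, non-maximum suppression, hysteresis thresholding
figure, subplot(1,2,1), imshow(bw1), subplot(1,2,2), imshow(bw2);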
Out of touch
The article seems to have been written by a group of pompous intellectuals who are out of touch with their readers: hyperlinking to articles like "shading" as though we've never heard of it, then just mentioning formulas without explaining them properly. The purpose of an encyclopaedia is to teach people about detailed areas of interest. I've studied computer vision myself at university and I find most of this article to be too vague, intellectual and just unhelpful. Specifically, the explanation of non-maximum suppression is unhelpful, and the illustration with the grey pixels should have its rows labelled (it is unclear for a while what the numbers mean); it is not even necessary anyway - you could just refer to the image of the girl. Owen214 (talk) 10:02, 16 June 2010 (UTC)
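Since non-maximum suppression is mentioned above, here is a rough MATLAB sketch of the idea for anyone trying to follow it: a pixel is kept only if its gradient magnitude is a local maximum along the (quantized) gradient direction. The arrays gx, gy and g are assumed to have been computed as in the gradient steps higher up on this page.
theta = atan2(gy, gx);               % gradient direction at each pixel
ang = mod(round(theta*4/pi), 4);     % quantize to 0, 45, 90 or 135 degrees
[nr, nc] = size(g);
nms = zeros(nr, nc);
for r = 2:nr-1
  for c = 2:nc-1
    switch ang(r, c)
      case 0                         % gradient roughly horizontal
        n1 = g(r, c-1); n2 = g(r, c+1);
      case 1                         % gradient roughly at 45 degrees (rows increase downwards)
        n1 = g(r+1, c+1); n2 = g(r-1, c-1);
      case 2                         % gradient roughly vertical
        n1 = g(r-1, c); n2 = g(r+1, c);
      otherwise                      % gradient roughly at 135 degrees
        n1 = g(r+1, c-1); n2 = g(r-1, c+1);
    end
    if g(r, c) >= n1 && g(r, c) >= n2
      nms(r, c) = g(r, c);           % keep only local maxima along the gradient
    end
  end
end
Implementations differ in whether they interpolate the neighbouring magnitudes or quantize the direction as here, but the quantized version conveys the idea.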