Talk:QR algorithm
If I remember correctly, in general the complexity is 2*n^3/3 + O(n^2).
"In this crude form the iterations are relatively expensive. This can be mitigated by first bringing the matrix A to upper Hessenberg form with a finite sequence of orthogonal similarity transforms, much like a QR decomposition. For symmetric matrices this replaces a O(n^3) procedure by one that is O(n^2)."
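The quoted passage can be illustrated with a minimal sketch (my own example, not from the article): reduce A to Hessenberg form once, then run the unshifted QR iteration A_{k+1} = R_k Q_k on the result. The matrix and eigenvalues below are illustrative.

```python
import numpy as np
from scipy.linalg import hessenberg

def qr_iteration(A, steps=100):
    # Bring A to upper Hessenberg form H = Q^T A Q once, up front.
    # This is a similarity transform, so H has the same eigenvalues as A,
    # and Hessenberg structure is preserved by every QR step that follows.
    H = hessenberg(A)
    for _ in range(steps):
        Q, R = np.linalg.qr(H)   # A_k = Q_k R_k
        H = R @ Q                # A_{k+1} = R_k Q_k, similar to A_k
    return H

# Example: a symmetric matrix with well-separated eigenvalues 1, 2, 4,
# built as A = Q D Q^T for a fixed orthogonal Q.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([4.0, 2.0, 1.0]) @ Q.T

H = qr_iteration(A)
print(np.sort(np.diag(H)))  # approximates the eigenvalues [1. 2. 4.]
```

Because A here is symmetric, the Hessenberg form is tridiagonal and the diagonal of the iterates converges to the eigenvalues.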
1) You might want to say somewhere that A0:=A :)
2) So, if A is symmetric, the procedure costs O(n^2)? Is that also the case for Hermitian matrices?
"If the original matrix is symmetric, then the Hessenberg matrix is also symmetric and thus tridiagonal, and so are all the Ak. This decreases the computational cost of the algorithm because now each QR decomposition is only O(n), not O(n^2)."
Is this not the same assumption as above, i.e. that A is symmetric? So, when is the cost O(n) and when is it O(n^2)? 131.155.54.18 13:58, 14 November 2005 (UTC), Wilbert Dijkhof
- Hi Wilbert. After consulting Golub & Van Loan, I think the costs are as follows. A single QR-decomposition costs 4/3 n^3 flops in the general case (omitting lower order terms). However, QR-decompositions for Hessenberg matrices cost 6n^2 flops, while it costs 10/3 n^3 flops to reduce a matrix to Hessenberg form using Householder transformations. If your matrix is symmetric, then the Hessenberg form is in fact tridiagonal. In this case, it costs 4/3 n^3 flops to bring it into tridiagonal form, and I think that every QR-decomposition then costs O(n) flops, but I'm not sure about that. Warning: I probably have some constants wrong.
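The reason a Hessenberg QR-decomposition is only O(n^2) is that only the first subdiagonal needs to be annihilated, which takes n-1 Givens rotations, each touching just two rows. A hedged sketch (function name and example matrix are my own, for illustration):

```python
import numpy as np

def hessenberg_qr(H):
    # QR decomposition of an upper Hessenberg matrix via Givens rotations.
    # Each of the n-1 rotations does O(n) work, so the total is O(n^2),
    # versus O(n^3) for Householder QR of a dense matrix.
    n = H.shape[0]
    R = H.astype(float).copy()
    Q = np.eye(n)
    for k in range(n - 1):
        a, b = R[k, k], R[k + 1, k]
        r = np.hypot(a, b)
        if r == 0.0:
            continue  # subdiagonal entry already zero
        c, s = a / r, b / r
        G = np.array([[c, s], [-s, c]])  # rotates (a, b) to (r, 0)
        # The rotation affects only rows k and k+1 -> O(n) flops.
        R[k:k+2, k:] = G @ R[k:k+2, k:]
        # Accumulate Q so that H = Q @ R.
        Q[:, k:k+2] = Q[:, k:k+2] @ G.T
    return Q, R

# Example: a random upper Hessenberg matrix.
rng = np.random.default_rng(1)
H = np.triu(rng.standard_normal((5, 5)), -1)
Q, R = hessenberg_qr(H)
```

For a symmetric (hence tridiagonal) matrix, each rotation would touch only O(1) entries, which is where the O(n) count per decomposition should come from.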
- I have no idea how much of this generalizes to Hermitian matrices. Please do change the article to make it clearer. Regards, Jitse Niesen (talk) 15:26, 14 November 2005 (UTC)
- Hi Jitse. Still in England? I clarified some things a bit, especially about the computational complexity. I also used Golub & Van Loan as a source. I hope it's good now. Wilbert Dijkhof, 131.155.54.18 14:55, 16 November 2005 (UTC)