Block Wiedemann algorithm

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Fivemack (talk | contribs) at 19:59, 12 June 2008 (carry on writing ...). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

The Block Wiedemann algorithm for computing the nullspace of a matrix over a finite field is a generalisation, due to Don Coppersmith, of an algorithm due to Douglas Wiedemann.

The Wiedemann algorithm

Let M be an n×n square matrix over some finite field F, and let x be a random vector of length n. Consider the sequence S = (x, Mx, M^2x, …) obtained by repeatedly multiplying the vector by the matrix M; let y be any other vector of length n, and consider the sequence of finite-field elements S_y = (y·x, y·Mx, y·M^2x, …).
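The construction of S_y can be sketched in a few lines. This is an illustrative sketch only: the prime P, the dimension N, and the helper names (matvec, dot) are assumptions for the example, not part of the algorithm's definition.

```python
# Sketch: build S_y = (y.x, y.Mx, y.M^2 x, ...) over GF(P) for a small prime P.
import random

P = 101   # a small prime; the field is GF(P)
N = 6     # matrix dimension n

def matvec(M, v):
    """Multiply the N x N matrix M by the vector v, reducing mod P."""
    return [sum(M[i][j] * v[j] for j in range(N)) % P for i in range(N)]

def dot(u, v):
    """Dot product of two vectors over GF(P)."""
    return sum(a * b for a, b in zip(u, v)) % P

random.seed(0)
M = [[random.randrange(P) for _ in range(N)] for _ in range(N)]
x = [random.randrange(P) for _ in range(N)]
y = [random.randrange(P) for _ in range(N)]

# S_y[t] = y . (M^t x); 2n terms suffice for Berlekamp-Massey to recover
# the minimal recurrence satisfied by the sequence.
seq = []
v = x[:]
for t in range(2 * N):
    seq.append(dot(y, v))
    v = matvec(M, v)
```

Note that each step reuses the previous vector M^t x, so the whole sequence costs 2n sparse matrix–vector products rather than repeated matrix powers.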

We know that the matrix M has a minimal polynomial; by the Cayley–Hamilton theorem we know that this polynomial is of degree (which we will call n_0) no more than n. Say \sum_{r=0}^{n_0} p_r M^r = 0. Then \sum_{r=0}^{n_0} p_r (y \cdot M^{r+t} x) = 0 for every t \geq 0; so the minimal polynomial of the matrix annihilates the sequence S, and hence also S_y.

But the Berlekamp–Massey algorithm allows us to calculate relatively efficiently some sequence of coefficients q_0 \ldots q_L with \sum_{r=0}^{L} q_r S_y[r+t] = 0 for all t.
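A minimal sketch of Berlekamp–Massey over GF(p) follows; the function name and the Fibonacci test sequence are illustrative choices, not part of the article. It returns the shortest connection polynomial annihilating the given scalar sequence.

```python
def berlekamp_massey(s, p):
    """Shortest linear recurrence over GF(p): returns (C, L) with C[0] = 1 and
    sum_{j=0}^{L} C[j] * s[i - j] == 0 (mod p) for all i in [L, len(s))."""
    C, B = [1], [1]      # current and previous connection polynomials
    L, m, b = 0, 1, 1    # LFSR length, gap since last update, last discrepancy
    for n in range(len(s)):
        # discrepancy between the term predicted by C and the actual term
        d = s[n] % p
        for j in range(1, L + 1):
            d = (d + C[j] * s[n - j]) % p
        if d == 0:
            m += 1
            continue
        coef = d * pow(b, p - 2, p) % p   # d / b in GF(p)
        T = C[:]
        if len(B) + m > len(C):
            C += [0] * (len(B) + m - len(C))
        for j in range(len(B)):
            C[j + m] = (C[j + m] - coef * B[j]) % p
        if 2 * L <= n:                    # the recurrence must lengthen
            L, B, b, m = n + 1 - L, T, d, 1
        else:
            m += 1
    return C, L

# Example: the Fibonacci sequence mod 101 satisfies s[i] - s[i-1] - s[i-2] = 0.
fib = [0, 1]
for _ in range(14):
    fib.append((fib[-1] + fib[-2]) % 101)
C, L = berlekamp_massey(fib, 101)
# verify that the recovered coefficients annihilate the sequence
ok = all(sum(C[j] * fib[i - j] for j in range(L + 1)) % 101 == 0
         for i in range(L, len(fib)))
```

Applied to S_y, the returned coefficients play the role of the q_r above.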

The Block Wiedemann algorithm

The natural implementation of sparse matrix arithmetic on a computer makes it easy to compute the sequence S in parallel for a number of vectors equal to the width of a machine word - indeed, it will normally take no longer to compute for that many vectors than for one. If you have several processors, you can compute the sequence S for a different set of random vectors in parallel on each of them.

It turns out, by a generalisation of the Berlekamp–Massey algorithm to provide a sequence of small matrices, that you can take the sequence produced for a large number of vectors and generate a kernel vector of the original large matrix. You need to compute y_i \cdot M^t x_j for some i = 0 \ldots i_{max}, j = 0 \ldots j_{max} and t = 0 \ldots t_{max}, where i_{max} and j_{max} need to satisfy t_{max} > n/i_{max} + n/j_{max} + O(1) and the y_i and x_j are series of vectors of length n; but in practice you can take the y_i as a sequence of unit vectors and simply write out the first i_{max} entries in your vectors at each time t.
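The sequence of small matrices fed to the matrix Berlekamp–Massey step can be sketched as below. This is an illustrative fragment, not Coppersmith's full algorithm: the field size, block sizes, and the slack constant standing in for the O(1) term are all assumptions.

```python
# Sketch over GF(P): build the small i_max x j_max matrices
# a_t[i][j] = y_i . (M^t x_j).  With y_i taken as unit vectors, as the text
# suggests, a_t is just the first i_max entries of each vector M^t x_j.
import random
random.seed(2)

P, N = 101, 8
I_MAX, J_MAX = 2, 2
T_MAX = N // I_MAX + N // J_MAX + 4   # t_max > n/i_max + n/j_max + O(1)

def matvec(M, v):
    """Multiply the N x N matrix M by the vector v, reducing mod P."""
    return [sum(M[i][j] * v[j] for j in range(N)) % P for i in range(N)]

M = [[random.randrange(P) for _ in range(N)] for _ in range(N)]
X = [[random.randrange(P) for _ in range(N)] for _ in range(J_MAX)]  # the x_j

blocks = []                    # blocks[t] is the i_max x j_max matrix a_t
cols = [x[:] for x in X]
for t in range(T_MAX):
    # unit-vector y_i: row i of a_t is simply entry i of each M^t x_j
    blocks.append([[cols[j][i] for j in range(J_MAX)] for i in range(I_MAX)])
    cols = [matvec(M, c) for c in cols]
```

Only the first i_max entries of each iterate are retained, which is what makes the unit-vector choice of y_i cheap in practice.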