
Binary GCD algorithm

From Wikipedia, the free encyclopedia

The binary GCD algorithm is an algorithm published by J. Stein in 1967 that computes the greatest common divisor of two positive integers. It gains a measure of efficiency over the ancient Euclidean algorithm by avoiding divisions, replacing them with subtraction and bitwise shifts, which are cheaper when operating on the binary representation used by modern computers. This is particularly critical on embedded platforms that have no direct processor support for division.

The algorithm reduces the problem of finding the GCD by repeatedly applying these identities (a worked example follows the list):

  1. If u is even and v is even, then GCD(u,v) = 2·GCD(u/2, v/2), since we know 2 is a common divisor.
  2. If u is even and v is odd, then GCD(u,v) = GCD(u/2, v), since we know 2 is not a common divisor.
  3. If u is odd and v is even, then GCD(u,v) = GCD(u, v/2), similarly.
  4. If u is odd and v is odd, then GCD(u,v) = GCD(|u-v|/2, v) = GCD(u, |u-v|/2). This is a combination of one step of the simple Euclidean algorithm, which uses subtraction at each step, and an application of identity 2 or 3 above. We know the division by 2 is possible because the difference of two odd numbers is even. We use whichever form keeps the smaller of the two original operands, so that the operands shrink as quickly as possible.
  5. If u = 0, then GCD(u,v) = v, since everything divides zero, and v is the largest number that divides v.
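
As an illustration, applying these identities to the pair (48, 18) proceeds as follows:

 GCD(48, 18) = 2·GCD(24, 9)   (identity 1: both even)
             = 2·GCD(12, 9)   (identity 2: 24 is even, 9 is odd)
             = 2·GCD(6, 9)    (identity 2)
             = 2·GCD(3, 9)    (identity 2)
             = 2·GCD(3, 3)    (identity 4: |3 - 9|/2 = 3)
             = 2·GCD(0, 3)    (identity 4: |3 - 3|/2 = 0)
             = 2·3 = 6        (identity 5)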

Following is a straightforward implementation of the algorithm in the C programming language, using only bit operations and subtraction. It first removes all common factors of 2 using identity 1, computes the gcd of the remaining numbers using identities 2 through 5, then combines these to form the final answer:

 unsigned int gcd(unsigned int u, unsigned int v) {
     unsigned int k = 0;
     while ((u & 1) == 0  &&  (v & 1) == 0) { /* while both u and v are even */
         u >>= 1;   /* shift u right, dividing it by 2 */
         v >>= 1;   /* shift v right, dividing it by 2 */
         k++;       /* add a power of 2 to the final result */
     }
     /* At this point either u or v (or both) is odd */
     while (u > 0) { 
         if ((u & 1) == 0)       /* if u is even */
             u >>= 1;            /* divide u by 2 */
         else if ((v & 1) == 0)  /* else if v is even */
             v >>= 1;            /* divide v by 2 */
         else if (u >= v)      /* u and v are both odd */
             u = (u-v) >> 1;
         else                  /* u and v both odd, v > u */
             v = (v-u) >> 1;
     }
     return v << k;  /* returns v * 2^k */
 }
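
As a quick sanity check, a minimal test program (the input values are chosen purely for illustration) might exercise the function like this:

 #include <stdio.h>
 
 unsigned int gcd(unsigned int u, unsigned int v);  /* defined above */
 
 int main(void) {
     printf("%u\n", gcd(48, 18));  /* prints 6, matching the worked example */
     printf("%u\n", gcd(0, 5));    /* prints 5, since GCD(0, v) = v by identity 5 */
     printf("%u\n", gcd(17, 17));  /* prints 17 */
     return 0;
 }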

The algorithm requires O((log₂ uv)²) worst-case time, or in other words, time proportional to the square of the number of bits in u and v together. Each step reduces at least one of the operands by at least a factor of two, so there are at most O(log₂ uv) steps; however, the subtraction and shift operations do not take constant time for very large integers (although they are still quite fast in practice, requiring about one operation per word of the representation), which accounts for the quadratic bound.