Talk:Fibonacci coding


Big-endian and little-endian?

Can anyone substantiate the premise that there are two separate incompatible systems, big-endian and little-endian, for encoding integers as the sum of Fibonacci numbers? I can find no verification of this and would like to see some before we change the page entirely to reflect this idea. -- Antaeus Feldspar 21:08, 3 Feb 2005 (UTC)

Big-endian just means that the bits are arranged in most-to-least significant order; little-endian means that they are arranged least-to-most. Given the description of the code here, which lists the bit for F(2) first, I would assume that little-endian is the only possibility. Ravenswood 21:39, 28 Apr 2005 (UTC)
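For reference, the code as described in the article emits the bit for F(2) = 1 first, then F(3) = 2, and so on, with a final 1 appended so every codeword ends in "11". Below is a minimal Python sketch of that encoding (the function name and the sample loop are only illustrative, not taken from the article), which makes the bit ordering explicit:

 def fibonacci_code(n):
     """Encode a positive integer as a Fibonacci codeword (illustrative sketch).

     Bits are emitted for F(2)=1, F(3)=2, F(4)=3, ... in increasing order,
     i.e. least significant Fibonacci term first, with a final '1' appended
     so every codeword ends in '11'.
     """
     if n < 1:
         raise ValueError("Fibonacci coding is defined for positive integers")
     # Build the Fibonacci numbers F(2)=1, F(3)=2, ... up to n.
     fibs = [1, 2]
     while fibs[-1] <= n:
         fibs.append(fibs[-1] + fibs[-2])
     if fibs[-1] > n:
         fibs.pop()
     # Greedy (Zeckendorf) selection from the largest term downward,
     # which guarantees no two consecutive 1 bits.
     bits = [0] * len(fibs)
     remainder = n
     for i in range(len(fibs) - 1, -1, -1):
         if fibs[i] <= remainder:
             bits[i] = 1
             remainder -= fibs[i]
     # Least significant term first, then the terminating '1'.
     return "".join(str(b) for b in bits) + "1"

 for n in range(1, 7):
     print(n, fibonacci_code(n))

For n = 1 through 6 this prints 11, 011, 0011, 1011, 00011, 10011, i.e. the smallest Fibonacci term comes first in each codeword.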