Talk:18-bit computing

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Comp.arch (talk | contribs) at 12:00, 25 July 2014 (Size of char/byte in these systems (first UNIX): new section). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.
WikiProject Computing (Stub-class, High-importance)
This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
Stub: This article has been rated as Stub-class on Wikipedia's content assessment scale.
High: This article has been rated as High-importance on the project's importance scale.

Size of char/byte in these systems (first UNIX)

18-bit is the word size. "Two bytes + 2 kiddies"[1] was probably meant just as a size comparison, but a byte in this scheme could actually be 18/2 = 9 bits, or 18/3 = 6 bits. Since the PDP-7 was an 18-bit machine and the first Unix machine, I wonder what they did. I think Unix has used at least 7-bit ASCII since the C-era UNIX, but the first version was pre-C assembly. Yes, C99 requires a byte to be at least 8 bits, but was a 6-bit byte allowed at some point? comp.arch (talk) 12:00, 25 July 2014 (UTC)
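
For illustration only (not part of the discussion above), here is a minimal C sketch of the arithmetic in question: three 6-bit character codes packed into one 18-bit word, held in a uint32_t on a modern machine. The character codes are hypothetical; the CHAR_BIT check reflects the C99 requirement (5.2.4.2.1) that a byte be at least 8 bits, which is why a 6-bit char is not a conforming C byte.

    #include <limits.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* C99 requires CHAR_BIT >= 8, so a 6-bit "byte" is not a
           conforming C char; an 18-bit machine would pack characters
           manually instead. */
        printf("CHAR_BIT here: %d (C99 guarantees >= 8)\n", CHAR_BIT);

        /* Pack three hypothetical 6-bit codes into 18 bits (3 x 6). */
        uint32_t a = 001, b = 002, c = 003;
        uint32_t word = (a << 12) | (b << 6) | c;
        printf("packed word: %06o (octal)\n", (unsigned)word);

        /* Unpack with shifts and a 6-bit mask (077 octal = 0b111111). */
        printf("unpacked: %o %o %o\n",
               (unsigned)((word >> 12) & 077),
               (unsigned)((word >> 6) & 077),
               (unsigned)(word & 077));
        return 0;
    }

The same shifting scheme with 9-bit fields (mask 0777, shift 9) would give the 18/2 = 9-bit alternative mentioned above.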