Talk:Fifth Generation Computer Systems
NPOV?
Not disputing the conclusions, but some of the language in this article doesn't seem exactly NPOV. --Chronodm 12:21, 13 March 2006 (UTC)
link
I don't know if this is intended, but the follow-the-leader link directs to the page on the marching band move. Tobias087 01:22, 11 April 2006 (UTC)
NPOV
The POV is slightly less than neutral. There are subtle cultural problems, and a degree of revisionism inherent in what has been written here. I can tell you that workstations were an afterthought in the period between 1982-1984. Architectural decisions in Japan at the time were slanted more toward mainframes. The article is missing 2 of the 3 basic key ideas Moto-Oka and others took away from the FGCS conference: 1) Data flow (Jack Dennis' ideas), 2) Prolog (which the article has, and I am not clear whether Warren was at the meeting like Jack was), and, I think, 3) Knowledge bases (Feigenbaum?). Those were starting ideas, but Moto-Oka's book should be examined for these. These were somewhat alarming at the time to USA-ians. Also, the project should not be confused with a similar Japanese superspeed project (Science article). I am not clear where I would begin editing without Moto-Oka's book. --enm ~21:00, 8 Aug 2006 (UTC)
Page move
The first paragraph suggests that the redirects and page title here are wrong:
- The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see history of computing hardware).
This implies that this page should be called "Fifth Generation Computer Systems project" or something similar. It also seems to indicate that "Fifth generation computer" should point to "history of computing hardware". In fact, the other "nth generation computer" articles already redirect to that article. Any thoughts? -- ShinmaWa(talk) 00:07, 21 August 2006 (UTC)
Anachronisms?
The "failure" chapter reads: the internet enabled locally stored databases to become distributed; even simple research projects provided better real-world results in data mining, Google being a good example
This is an anachronism. Google was founded sixteen years after the start of the FGCS project. Are there any contemporary examples of simple research projects outperforming the FGCS project in their own field? Was the ability of other systems to distribute databases relevant if the FGCS databases were custom-made for the task anyway and not something you stored sales records in? Further down it reads:
The workstations had no appeal in a market where single-CPU systems could outrun them, and the entire concept was overtaken by the Internet.
Massive parallel computing on workstations connected to the internet didn't have its breakthrough until the release of SETI@home in 1999. Before that, supercomputers with a large number of CPUs working in parallel were the norm. An example contemporary to the FGCS project was the "supercomputer" Cray-2. In light of that, "overtaken by the internet" is an anachronism. In any case, the AI software and related concepts got scrapped as well; nobody, as far as I know, ported similar systems to individual computers hooked up in a network in the late 80s to early 90s. EverGreg (talk) 14:53, 31 March 2008 (UTC)