
Hardware for artificial intelligence


Specialized hardware for artificial intelligence is used to execute artificial intelligence programs faster.

Lisp machines

Lisp machines were developed in the late 1970s and early 1980s to make AI programs written in the programming language Lisp run faster.

Neural network machines

Since the 2010s, advances in computer hardware have led to more efficient methods for training deep neural networks that contain many layers of non-linear hidden units and a very large output layer.[1] By 2019, graphics processing units (GPUs), often with AI-specific enhancements, had displaced CPUs as the dominant means of training large-scale commercial cloud AI.[2] OpenAI estimated the hardware compute used in the largest deep learning projects from AlexNet (2012) to AlphaZero (2017) and found a 300,000-fold increase in the amount of compute required, with a doubling time of 3.4 months along the trendline.[3][4]
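
The two reported figures can be cross-checked with a short calculation (a minimal sketch in Python; the 300,000-fold factor and the 3.4-month doubling time are the values reported in the cited sources, not derived here):

    import math

    # Reported values from OpenAI's "AI and Compute" analysis.[3][4]
    compute_growth = 300_000      # fold increase from AlexNet (2012) to AlphaZero (2017)
    doubling_time_months = 3.4    # reported doubling-time trendline

    # A 300,000-fold increase implies log2(300000) ~ 18.2 doublings.
    doublings = math.log2(compute_growth)

    # At one doubling every 3.4 months, those doublings span about 62 months
    # (~5.2 years), consistent with the roughly five-year gap between the
    # two projects.
    span_months = doublings * doubling_time_months
    print(f"{doublings:.1f} doublings over {span_months:.0f} months "
          f"(~{span_months / 12:.1f} years)")
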

Sources

  1. ^ Research, AI (23 October 2015). "Deep Neural Networks for Acoustic Modeling in Speech Recognition". airesearch.com. Retrieved 23 October 2015.
  2. ^ "GPUs Continue to Dominate the AI Accelerator Market for Now". InformationWeek. December 2019. Retrieved 11 June 2020.
  3. ^ Ray, Tiernan (2019). "AI is changing the entire nature of compute". ZDNet. Retrieved 11 June 2020.
  4. ^ "AI and Compute". OpenAI. 16 May 2018. Retrieved 11 June 2020.