
Groq

From Wikipedia, the free encyclopedia
Groq, Inc.
Company type: Private
Industry:
Founded: 2016
Founders: Jonathan Ross
Headquarters: Mountain View, California, US
Key people: Jonathan Ross (CEO), Sunny Madra (COO), Andrew S. Rappaport (board member), Chamath Palihapitiya (investor), John Yetimoglu (board member/investor)
Products: Language Processing Unit (LPU)
Revenue: US$3.2 million (2023)[1]
Net income: US$−88 million (2023)[1]
Number of employees: 250 (2023)
Website: groq.com

Groq, Inc. is an American artificial intelligence (AI) company that builds an AI accelerator application-specific integrated circuit (ASIC). The architecture was originally introduced as a Tensor Streaming Processor (TSP) but was later rebranded as a Language Processing Unit (LPU) following the widespread adoption of large language models after the breakthrough of ChatGPT. The company also develops related computer hardware and software to accelerate AI inference performance.

Examples of the types of AI workloads that run on Groq's LPU are: large language models (LLMs),[2][3] image classification,[4] and predictive analysis.[5][6]

Groq is headquartered in Mountain View, California, and has offices in San Jose, California; Liberty Lake, Washington; Toronto, Canada; and London, United Kingdom, as well as remote employees throughout North America and Europe.

In December 2025, Nvidia and Groq announced an agreement reportedly valued at approximately US$20 billion to license Groq's AI inference technology and to transfer several senior Groq executives to Nvidia.[7][8] Groq stated that it would continue to operate as an independent company.[9]

History


Groq was founded in 2016 by a group of former Google engineers led by Jonathan Ross, one of the designers of the Tensor Processing Unit (TPU), an AI accelerator ASIC, and Douglas Wightman, an entrepreneur and former engineer at Google X (now X Development), who served as the company's first CEO.[10][1]

Groq received seed funding from Social Capital's Chamath Palihapitiya, who invested $10 million in 2017,[11] and secured additional funding soon after.

In April 2021, Groq raised $300 million in a series C round led by Tiger Global Management and D1 Capital Partners.[12] Current investors include: The Spruce House Partnership, Addition, GCM Grosvenor, Xⁿ, Firebolt Ventures, General Global Capital, and Tru Arrow Partners, as well as follow-on investments from TDK Ventures, XTX Ventures, Boardman Bay Capital Management, and Infinitum Partners.[13] After Groq’s series C funding round, it was valued at over $1 billion, making the startup a unicorn.[14]

On March 1, 2022, Groq acquired Maxeler Technologies, a company known for its dataflow systems technologies.[15]

On August 16, 2023, Groq selected Samsung Electronics' foundry in Taylor, Texas to manufacture its next-generation chips, on Samsung's 4-nanometer (nm) process node. This was the first order at this new Samsung chip factory.[16]

On February 19, 2024, Groq soft-launched GroqCloud, a developer platform intended to attract developers to the Groq API and to rent out access to its chips.[17][1] On March 1, 2024, Groq acquired Definitive Intelligence, a startup offering a range of business-oriented AI solutions, to help build out its cloud platform.[18]
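As an illustration of how a developer might use GroqCloud, the following is a hypothetical sketch of assembling a request to the Groq API. The endpoint path, model name, and field layout are assumptions based on the API being OpenAI-compatible; Groq's own documentation describes the actual interface.

```python
import json

def build_chat_request(model: str, prompt: str, api_key: str):
    """Assemble the URL, headers, and JSON body for a chat request.

    All field names below are assumptions modeled on an
    OpenAI-compatible chat-completions API, not a verified spec.
    """
    url = "https://api.groq.com/openai/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request("example-model", "Hello", "API_KEY")
# The request would then be sent with any HTTP client,
# e.g. requests.post(url, headers=headers, data=body)
```

The sketch only constructs the request; sending it requires a valid GroqCloud API key.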

Groq raised $640 million in a series D round led by BlackRock Private Equity Partners in August 2024, valuing the company at $2.8 billion.[1][19]

On February 10, 2025, Groq announced that it had secured a US$1.5 billion commitment from the Kingdom of Saudi Arabia to expand delivery of its LPU-based AI inference infrastructure, tied to a new GroqCloud data center in Dammam, Saudi Arabia.[20][21]

As of 2025, Groq operates a dozen data centers across the U.S., Canada, the Middle East, and Europe.[22]

In December 2025, Nvidia agreed to purchase assets from Groq for approximately US$20 billion, the largest deal in Nvidia's history.[23] Groq described the transaction as a non-exclusive licensing deal.[9] As part of the agreement, Groq founder Ross and president Sunny Madra would join Nvidia.[7][8]

Language Processing Unit

A die photo of Groq’s first streaming processor

Groq initially called its ASIC the Tensor Streaming Processor (TSP), but later rebranded it as the Language Processing Unit (LPU).[2][24][25]

The LPU features a functionally sliced microarchitecture, in which memory units are interleaved with vector and matrix computation units.[26] This design facilitates exploiting dataflow locality in AI compute graphs, improving execution performance and efficiency. The LPU design is based on two key observations:

  1. AI workloads exhibit substantial data parallelism, which can be mapped onto purpose-built hardware, leading to performance gains.[26]
  2. A deterministic processor design, coupled with a producer-consumer programming model, allows precise control of and reasoning about hardware components, enabling optimized performance and energy efficiency.[26]

In addition to its functionally sliced microarchitecture, the LPU is characterized by its single-core, deterministic architecture.[26][27] It achieves deterministic execution by avoiding traditional reactive hardware components (branch predictors, arbiters, reordering buffers, caches)[26] and by having the compiler explicitly schedule all execution, guaranteeing determinism when running an LPU program.[26]
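The compiler-controlled execution model described above can be illustrated with a toy sketch. This is purely illustrative and is not Groq's actual toolchain: a "program" is a schedule fixed at compile time, listing which operation each functional unit performs at each step, so the step count and results are identical on every run.

```python
# Toy model of statically scheduled, deterministic execution.
# With no caches, arbiters, or reorder buffers, execution order
# and step count never vary between runs.

def run(schedule, regs):
    """Execute a compiler-fixed schedule of (dest, op, srcs) steps."""
    for dest, op, srcs in schedule:           # order fixed at "compile time"
        vals = [regs[s] for s in srcs]
        regs[dest] = op(*vals)
    return len(schedule), regs                # deterministic step count

schedule = [
    ("t0", lambda a, b: a * b, ("x", "w")),   # a matrix-unit step
    ("t1", lambda a, b: a + b, ("t0", "b")),  # a vector-unit step
]
cycles, regs = run(schedule, {"x": 3, "w": 2, "b": 1})
# cycles == 2 and regs["t1"] == 7 on every run
```

Because nothing is resolved dynamically at runtime, the "hardware" in this model needs no speculation or arbitration logic, which is the property the deterministic design exploits.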

The first generation of the LPU (TSP) yields a computational density of more than 1 TeraOp/s per square millimeter of silicon for its 25×29 mm 14 nm chip operating at a nominal clock frequency of 900 MHz.[26] The second generation of the LPU (LPU v2) will be manufactured on Samsung's 4 nm process node.[16]
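A back-of-the-envelope check of the figures quoted above: at more than 1 TeraOp/s per square millimeter on a 25 mm × 29 mm die, the first-generation chip would deliver more than 725 TeraOps/s in total.

```python
# Sanity check of the quoted density figure for the first-generation TSP.
die_area_mm2 = 25 * 29             # 725 mm^2 of silicon
density_tops_per_mm2 = 1.0         # lower bound quoted in the ISCA paper
total_tops = die_area_mm2 * density_tops_per_mm2
print(total_tops)                  # lower bound for the whole die, in TeraOps/s
```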

Groq hosts open-source large language models running on its LPUs, with public access to these demos available through Groq's website.[28]


References

  1. ^ a b c d e Nieva, Richard (August 5, 2024). "The AI Chip Boom Saved This Tiny Startup. Now Worth $2.8 Billion, It's Taking On Nvidia". Forbes.
  2. ^ a b Williams, Wayne (27 February 2024). "'Feels like magic!': Groq's ultrafast LPU could well be the first LLM-native processor — and its latest demo may well convince Nvidia and AMD to get out their checkbooks". TechRadar Pro. TechRadar. Retrieved 19 April 2024.
  3. ^ Ward-Foxton, Sally (12 September 2023). "Groq Demonstrates Fast LLMs on 4-Year-Old Silicon". EETimes. Retrieved 19 April 2024.
  4. ^ Ward-Foxton, Sally (21 January 2020). "Groq's AI Chip Debuts in the Cloud". EETimes. Retrieved 19 April 2024.
  5. ^ Heinonen, Nils. "Researchers accelerate fusion research with Argonne's Groq AI platform". Argonne Leadership Computing Facility. Retrieved 19 April 2024.
  6. ^ Larwood, Mariah; Cerny, Beth. "Argonne deploys new Groq system to ALCF AI Testbed, providing AI accelerator access to researchers globally". Argonne Leadership Computing Facility. Retrieved 19 April 2024.
  7. ^ a b Nellis, Stephan (24 December 2025). "Nvidia, joining Big Tech deal spree, to license Groq technology, hire executives". Reuters. Retrieved 26 December 2025.
  8. ^ a b Silberling, Amanda (24 December 2025). "Nvidia to license AI chip challenger Groq's tech and hire its CEO". TechCrunch. Retrieved 26 December 2025.
  9. ^ a b "Groq and NVIDIA enter non-exclusive inference technology licensing agreement". Groq. 24 December 2025. Retrieved 26 December 2025.
  10. ^ Levy, Ari (21 April 2017). "Several Google engineers have left one of its most secretive AI projects to form a stealth start-up". CNBC. Retrieved 19 April 2024.
  11. ^ Clark, Kate (6 September 2018). "Secretive semiconductor startup Groq raises $52M from Social Capital". TechCrunch. Retrieved 19 April 2024.
  12. ^ King, Ian (14 April 2021). "Tiger Global, D1 Lead $300 Million Round in AI Chip Startup Groq". Bloomberg. Retrieved 19 April 2024.
  13. ^ Wheatly, Mike (14 April 2021). "AI chipmaker Groq raises $300M in Series C round". Silicon Angle. Retrieved 19 April 2024.
  14. ^ Andonov, Kaloyan; Lavine, Rob (19 April 2021). "Analysis: Groq computes a $300m series C". Global Venturing. Retrieved 19 April 2024.
  15. ^ Prickett Morgan, Timothy (2 March 2022). "Groq Buys Maxeler for Its HPC and AI Dataflow Expertise". The Next Platform. Retrieved 19 April 2024.
  16. ^ a b Hwang, Jeong-Soo. "Samsung's new US chip fab wins first foundry order from Groq". The Korea Economic Daily. Retrieved 19 April 2024.
  17. ^ Franzen, Carl (March 2024). "Groq launches developer playground GroqCloud with newly acquired Definitive Intelligence". Venture Beat. Retrieved 19 April 2024.
  18. ^ Wiggers, Kyle (March 2024). "AI chip startup Groq forms new business unit, acquires Definitive Intelligence". TechCrunch. Retrieved 19 April 2024.
  19. ^ Wiggers, Kyle (2024-08-05). "AI chip startup Groq lands $640M to challenge Nvidia". TechCrunch. Retrieved 2024-08-26.
  20. ^ "Saudi Arabia Announces $1.5 Billion Expansion to Fuel AI-powered Economy with Groq". Groq. February 10, 2025. Retrieved 24 December 2025.
  21. ^ "AI chip startup Groq secures $1.5 billion commitment from Saudi Arabia". Reuters. February 10, 2025. Retrieved 24 December 2025.
  22. ^ Kao, Kimberley (2025-10-03). "AI Unicorn Groq Charts Data-Center Expansion Plan". The Wall Street Journal. Retrieved 2025-10-04.
  23. ^ Faber, David (24 December 2025). "Exclusive: Nvidia buying AI chip startup Groq's assets for about $20 billion in its largest deal on record". CNBC. Retrieved 26 December 2025.
  24. ^ Mellor, Chris (23 January 2024). "Grokking Groq's Groqness". Blocks & Files. Retrieved 19 April 2024.
  25. ^ Abts, Dennis; Ross, Jonathan; Sparling, Jonathan; Wong-VanHaren, Mark; Baker, Max; Hawkins, Tom; Bell, Andrew; Thompson, John; Kahsai, Temesghen; Kimmell, Garrin; Hwang, Jennifer; Leslie-Hurd, Rebekah; Bye, Michael; Creswick, E.R.; Boyd, Matthew; Venigalla, Mahitha; Laforge, Evan; Purdy, Jon; Kamath, Purushotham; Maheshwari, Dinesh; Beidler, Michael; Rosseel, Geert; Ahmad, Omar; Gagarin, Gleb; Czekalski, Richard; Rane, Ashay; Parmar, Sahil; Werner, Jeff; Sproch, Jim; Macias, Adrian; Kurtz, Brian (May 2020). "Think Fast: A Tensor Streaming Processor (TSP) for Accelerating Deep Learning Workloads" (PDF). 2020 ACM/IEEE 47th Annual International Symposium on Computer Architecture (ISCA). pp. 145–158. doi:10.1109/ISCA45697.2020.00023. ISBN 978-1-7281-4661-4.
  26. ^ a b c d e f g Abts, Dennis; Kimmell, Garrin; Ling, Andrew; Kim, John; Boyd, Matt; Bitar, Andrew; Parmar, Sahil; Ahmed, Ibrahim; Dicecco, Roberto; Han, David; Thompson, John; Bye, Michael; Hwang, Jennifer; Fowers, Jeremy; Lillian, Peter; Murthy, Ashwin; Mehtabuddin, Elyas; Tekur, Chetan; Sohmers, Thomas; Kang, Kris; Maresh, Stephen; Ross, Jonathan (June 11, 2022). "A software-defined tensor streaming multiprocessor for large-scale machine learning". Proceedings of the 49th Annual International Symposium on Computer Architecture. pp. 567–580. doi:10.1145/3470496.3527405. ISBN 978-1-4503-8610-4. Retrieved 2024-03-18.
  27. ^ Singh, Satnam (February 11, 2022). "The Virtuous Cycles of Determinism: Programming Groq's Tensor Streaming Processor". Proceedings of the 2022 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays. p. 153. doi:10.1145/3490422.3510453. ISBN 978-1-4503-9149-8. Retrieved 2024-03-18.
  28. ^ Morrison, Ryan (27 February 2024). "Meet Groq — the chip designed to run AI models really, really fast". Tom’s Guide. Retrieved 19 April 2024.