
Talk:Tensor Processing Unit

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Scope creep (talk | contribs) at 18:22, 18 December 2017. The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.


Adding Nvidia-specific solutions?

Nvidia has stated that their DLA unit (maybe part of the Volta accelerator; going to be part of e.g. the Xavier SoC) is a TPU that Nvidia explicitly open-sourced. See the source links here: Tegra#Xavier. Do you think such a realisation of the concept by a third party (non-Google) should be added to this article, or should it receive its own article? As of now I would prefer the first option. The topic is still compact, and seeing the TPU concept evolve over time in a single article might rather benefit the reader, especially because the article would then be capable of covering all active vendors for this topic. --Alexander.stohr (talk) 15:24, 22 August 2017 (UTC)[reply]

A single source for discussing TPU devices seems like a good idea as this market grows. However, I am not sure if this is the place. Right now this article is focused on the Google TPU products and would require a distinct rewrite to make it generic. I am not opposed to doing so, but I am not sure there is a consensus that "TPU" has been genericized. (In another context, Nvidia has called the same/similar co-processor a "tensor core" rather than a TPU.[1]) Another idea might be a hardware-accelerator section within the TensorFlow article, with links to Nvidia and Google product pages. Thoughts? Dbsseven (talk) 15:38, 22 August 2017 (UTC)[reply]
I would think if independent reliable sources call it a TPU then it would be okay to incorporate the information into this article. News articles may just repeat Nvidia's talking points so it would be best to rely on technical/academic sources. Sizeofint (talk) 18:33, 22 August 2017 (UTC)[reply]

References

Rename to "Google TPU"

Tensor processing unit → Google TPU – The page mainly describes Google TPUs. That is why I think the company name should be put into the title. 2.92.113.239 (talk) 17:31, 17 December 2017 (UTC)[reply]

  • Oppose – Does not look like a commercial branded product, rather some R&D chip type; other companies make similar chips, even if under different names. Better expand the article to define the generic structure and include all relevant chips. — JFG talk 08:33, 18 December 2017 (UTC)[reply]
  • Oppose I agree with JFG. Look for other AI FPUs that work on the high-speed, low-precision FPU principle and expand the article accordingly. scope_creep (talk) 10:49, 18 December 2017 (UTC)[reply]
scope creep and @JFG: FYI, there is already a highly related general article: AI accelerator. Therefore I'm not sure this is the place for expansion/generalization, unless there is a lot of Tensor-specific content. Dbsseven (talk) 17:25, 18 December 2017 (UTC)[reply]
  • Comment I added a WP:PAID disclosure request to the user, User talk:2.92.113.239, because I think this is a push to get some free advertising on the part of Google, re: this article. I'm not saying generalize it; I'm looking to add as much detail as possible. This is one processor of a class of processors, which should be described by their architecture and API, not by name. I don't like the AI accelerator article; it is essentially a pamphlet offering free advertising. Dbsseven, kudos to you, I see you have tried to smarten it up a bit, but the article is a first cut, and I think it is rank. Statements like this (in this article) are almost fancruft: "Google stated the first generation TPU design was memory bandwidth limited", instead of "the first generation TPU design was memory bandwidth limited". The first violates WP:NOTADVERTISING; the second doesn't. The point I'm trying to make is: it is a new field, and lots of new disruptive designs are coming out, and everybody is trying to find what works, but processor design follows an ethos, a design language; designs come out of the universities after decades of research, so I know for a fact there are other ultra-high-speed, low-precision FPU processors out there. This article is about the TPU, but it should have been an architecture article describing ultra-high-speed, high-bandwidth, low-precision FPU processors, with the TPU as one example amongst several others, including a good description of the architecture. scope_creep (talk) 18:21, 18 December 2017 (UTC)[reply]
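For readers unfamiliar with the "high-speed, low-precision" principle mentioned in the comments above, here is a minimal illustrative sketch (not drawn from the discussion or from any vendor's actual hardware): bulk multiply-accumulates are done in a narrow floating-point format, and the result is compared against a full-precision reference. TPU-style matrix units apply the same idea, typically with wider accumulation.

```python
# Illustrative sketch of the "high-speed, low-precision" principle:
# do the matrix multiply in a narrow format (float16 here) and check
# how close it lands to a full-precision (float32) reference.
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((128, 128)).astype(np.float32)
b = rng.standard_normal((128, 128)).astype(np.float32)

# Full-precision reference result.
ref = a @ b

# Low-precision version: cast the inputs down to float16 before
# multiplying, then bring the product back up to float32.
low = (a.astype(np.float16) @ b.astype(np.float16)).astype(np.float32)

# Relative error, normalized by the largest reference entry. The two
# results agree to roughly float16 precision, which is often adequate
# for neural-network inference even though it would be unacceptable
# for classic scientific computing.
rel_err = np.max(np.abs(low - ref)) / np.max(np.abs(ref))
print(rel_err)
```

The hardware appeal is that narrow formats halve (or better) the memory traffic per operand and let many more multiply units fit in the same silicon area, which is exactly the bandwidth/throughput trade-off discussed in the article.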