Talk:Tensor Processing Unit
This article has not yet been rated on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Please add the quality rating to the {{WikiProject banner shell}} template instead of this project banner. See WP:PIQA for details.
Adding Nvidia-specific solutions?
Nvidia has stated that their DLA unit (maybe part of the Volta accelerator; going to be a part of e.g. the Xavier SoC) is a TPU that Nvidia explicitly open-sourced. See the source links here: Tegra#Xavier. Do you think such a realisation of the concept by a third party (non-Google) should be added to this article, or should it receive its own article? As of now I would prefer the first option. The topic is still compact, and seeing the TPU concept evolve over time in a single article might rather benefit the reader, especially because the article would then be able to cover all active vendors on this topic. --Alexander.stohr (talk) 15:24, 22 August 2017 (UTC)
- A single source for discussing TPU devices seems like a good idea as this market grows. However, I am not sure if this is the place. Right now this article is focused on Google's TPU products, and it would require a distinct re-write to make it generic. I am not opposed to doing so, but I am not sure there is a consensus that "TPU" has been genericized. (In another context, Nvidia has called the same/similar co-processor a "tensor core" rather than a TPU.[1]) Another idea might be a hardware accelerator section within the TensorFlow article, with links to Nvidia and Google product pages. Thoughts? Dbsseven (talk) 15:38, 22 August 2017 (UTC)
- I would think if independent reliable sources call it a TPU then it would be okay to incorporate the information into this article. News articles may just repeat Nvidia's talking points so it would be best to rely on technical/academic sources. Sizeofint (talk) 18:33, 22 August 2017 (UTC)
Rename to "Google TPU"
It has been proposed in this section that Tensor Processing Unit be renamed and moved to Google TPU. A bot will list this discussion on the requested moves current discussions subpage within an hour of this tag being placed. The discussion may be closed 7 days after being opened, if consensus has been reached (see the closing instructions). Please base arguments on article title policy, and keep discussion succinct and civil. Please use {{subst:requested move}}. Do not use {{requested move/dated}} directly.
Tensor processing unit → Google TPU – The page mainly describes Google TPUs. That is why I think the company name should be put into the title. 2.92.113.239 (talk) 17:31, 17 December 2017 (UTC)
- Oppose – Does not look like a commercial branded product, rather some R&D chip type; other companies make similar chips, even if under different names. Better expand the article to define the generic structure and include all relevant chips. — JFG talk 08:33, 18 December 2017 (UTC)
- Oppose I agree with JFG. Look for other AI FPUs that work on the high-speed, low-precision FPU principle and expand the article accordingly. scope_creep (talk) 10:49, 18 December 2017 (UTC)
- scope creep and @JFG: FYI, there is already a highly related general article: AI accelerator. Therefore I'm not sure this is the place for expansion/generalization, unless there is a lot of Tensor specific content. Dbsseven (talk) 17:25, 18 December 2017 (UTC)
- Oppose - I'm not sure adding "Google" aids the article in any way. Right now there isn't a need for further specificity in the title (IMO). Dbsseven (talk) 17:25, 18 December 2017 (UTC)
- Comment I added a WP:PAID disclosure request to the user, User talk:2.92.113.239, because I think this is a push to get some free advertising on the part of Google, re: this article. I'm not saying generalize it; I'm looking to add as much detail as possible. This is one processor of a class of processors, and that class should be described by architecture and API, not by name. I don't like the AI accelerator article; it is essentially a pamphlet offering free advertising. Dbsseven, kudos to yourself, I see you have tried to smarten it up a bit, but the article is a first cut, and I think it is rank. Statements like this (in this article) are almost fancruft: "Google stated the first generation TPU design was memory bandwidth limited", instead of "the first generation TPU design was memory bandwidth limited". The first violates WP:NOTADVERTISING; the second doesn't. The point I'm trying to make is this: it is a new field, and lots of new disruptive designs are coming out, and everybody is trying to find what works, but processor design follows an ethos, a design language; designs come out of the universities after decades of research, so I know for a fact there are other ultra-high-speed, low-precision FPU processors out there. This article is about the TPU, but it should have been an architecture article describing ultra-high-speed, high-bandwidth, low-precision FPU processors, with the TPU as one example amongst several others, including a good description of the architecture. scope_creep (talk) 18:21, 18 December 2017 (UTC)