Talk:Tensor Processing Unit
This article has not yet been rated on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Please add the quality rating to the {{WikiProject banner shell}} template instead of this project banner. See WP:PIQA for details.
adding Nvidia specific solutions?
Nvidia has stated that their DLA unit (maybe part of the Volta accelerator; going to be a part of e.g. the Xavier SoC) is a TPU that Nvidia explicitly open-sourced. See the source links here: Tegra#Xavier. Do you think such a realisation of the concept by a third party (non-Google) should be added to this article, or should it receive its own article? As of now I would prefer the first option. The topic is still compact, and seeing the TPU concept evolve over time in a single article might better serve the reader, especially because the article could then cover all active vendors in this space. --Alexander.stohr (talk) 15:24, 22 August 2017 (UTC)
- A single source for discussing TPU devices seems like a good idea as this market grows. However, I am not sure if this is the place. Right now this article is focused on the Google TPU products, and it would require a distinct rewrite to make it generic. I am not opposed to doing so, but I am not sure there is consensus that "TPU" has been genericized. (In another context, Nvidia has called the same or a similar co-processor a "tensor core" rather than a TPU.[1]) Another idea might be a hardware accelerator section within the TensorFlow article, with links to Nvidia and Google product pages. Thoughts? Dbsseven (talk) 15:38, 22 August 2017 (UTC)
- I would think if independent reliable sources call it a TPU then it would be okay to incorporate the information into this article. News articles may just repeat Nvidia's talking points so it would be best to rely on technical/academic sources. Sizeofint (talk) 18:33, 22 August 2017 (UTC)
Rename to "Google TPU"
It has been proposed in this section that Tensor Processing Unit be renamed and moved to Google TPU. A bot will list this discussion on the requested moves current discussions subpage within an hour of this tag being placed. The discussion may be closed 7 days after being opened, if consensus has been reached (see the closing instructions). Please base arguments on article title policy, and keep discussion succinct and civil. Please use {{subst:requested move}}. Do not use {{requested move/dated}} directly.
Tensor processing unit → Google TPU – The page mainly describes Google TPUs. That is why I think the company name should be put into the title. 2.92.113.239 (talk) 17:31, 17 December 2017 (UTC)
- Oppose – Does not look like a commercial branded product, rather some R&D chip type; other companies make similar chips, even if under different names. Better expand the article to define the generic structure and include all relevant chips. — JFG talk 08:33, 18 December 2017 (UTC)
- Oppose I agree with JFG. Look for other AI FPUs that work on the high-speed, low-precision FPU principle and expand the article accordingly. scope_creep (talk) 10:49, 18 December 2017 (UTC)