Talk:Comparison of deep learning software
WikiProject Computing (Software / Computer science): List-class, Low-importance
Subpage: Resources
I created the subpage Comparison of deep learning software/Resources to list deep learning software that hasn't been examined yet, and to host links to external pages, since all external links I have added to this page have been removed. —Kri (talk) 15:52, 19 March 2016 (UTC)
MATLAB
Hi. I work at MathWorks and we believe the addition of a MATLAB row to the deep learning software comparison table would add value for readers. What is the recommended approach to get this row added? Thanks, Shyamal. — Preceding unsigned comment added by Shyamal1980 (talk • contribs) 20:02, 6 February 2017 (UTC)
Amazon's DSSTNE
I haven't looked into DSSTNE at all yet, but I assume it should be added to the list Bervin61 (talk) 16:42, 24 May 2016 (UTC)
- Absolutely. DSSTNE is listed in the section "Deep learning software not yet covered", so the plan is to cover it. —Kri (talk) 15:32, 25 May 2016 (UTC)
Caffe
The article definitely needs to list Caffe — Preceding unsigned comment added by 73.225.48.12 (talk) 07:23, 12 March 2017 (UTC)
Theano OpenCL support
According to their website, Theano can be run on OpenCL with gpuarray now. http://deeplearning.net/software/theano/tutorial/using_gpu.html
Has anyone tested this? Should the table's status be changed to "Yes" or "Partial support"? — Preceding unsigned comment added by Willyfisch (talk • contribs) 14:43, 19 May 2017 (UTC)
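For anyone who wants to test this, a minimal sketch of how the linked tutorial says the backend is selected: with the libgpuarray (gpuarray) backend, the device is chosen via the `THEANO_FLAGS` environment variable, and OpenCL devices are named `opencl<platform>:<device>` rather than `cuda<n>`. The script name `check_gpu.py` is a hypothetical placeholder for whatever test script you run.

```shell
# Minimal sketch, assuming Theano with the libgpuarray backend is installed.
# Select OpenCL platform 0, device 0 instead of a CUDA device.
# "check_gpu.py" is a hypothetical script that reports which device Theano used.
THEANO_FLAGS="device=opencl0:0,floatX=float32" python check_gpu.py
```

If the script reports an `opencl0:0` mapping at startup, the OpenCL path is at least being exercised, which would support changing the table entry from "No".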
Mathematica OpenCL support vs. MXNet's lack of such support
The table claims that MXNet does not have OpenCL support but Mathematica does. However, the reference cited for Mathematica, a blog post by Stephen Wolfram, states that Mathematica's deep learning support is based on the MXNet engine:
- It’s worth saying that underneath the whole integrated symbolic interface, the Wolfram Language is using a very efficient low-level library—currently MXNet—which takes care of optimizing ultimate performance for the latest CPU and GPU configurations. By the way, another feature enhanced in 11.1 is the ability to store complete neural net specifications, complete with encoders, etc. in a portable and reusable .wlnet file. [1]
Can someone reconcile these two apparently contradictory claims? While I am no expert in this, supporting OpenCL seems sufficiently low-level that it would be hard to imagine that Mathematica could do it if the underlying deep-learning engine could not. --Saforrest (talk) 15:29, 29 May 2017 (UTC)