
Flux (machine-learning framework)

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by ManuelRodriguez (talk | contribs) at 07:12, 19 March 2020 (Update a short sentence about the relationship of the flux software with the underlying gpu hardware.). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.


Flux
Original author(s): Michael J Innes[1]
Stable release: v0.10.1
Repository: github.com/FluxML/Flux.jl
Written in: Julia
Type: Machine learning library
Website: https://fluxml.ai

Flux is an open-source machine-learning library and ecosystem written entirely in Julia.[1][2] Its current stable release is v0.10.1.[3] Flux exposes an intuitive and flexible interface built on Julia's just-ahead-of-time compilation, while still providing a simpler layer-stacking interface for common models, and it integrates readily with other Julia packages.[4] Because Flux is pure Julia, it can take full advantage of all Julia language features and work with almost any Julia package; for example, GPU support is supplied transparently by CuArrays.jl through Julia's multiple dispatch.[5] This contrasts with machine-learning frameworks implemented in other languages with Julia bindings, such as TensorFlow.jl, which are limited by the functionality present in the underlying implementation, often written in C or C++.[6]
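The layer-stacking interface mentioned above can be illustrated with a short sketch (the layer sizes, optimizer, and dummy data here are illustrative choices, not taken from the article):

```julia
using Flux

# A small feed-forward classifier built by stacking layers with Chain.
model = Chain(
    Dense(10, 5, relu),   # 10 inputs -> 5 hidden units
    Dense(5, 2),          # 2 output classes
    softmax)

# Cross-entropy loss over the model's predictions.
loss(x, y) = Flux.crossentropy(model(x), y)

# One training pass over a dummy batch of 8 samples.
x = rand(Float32, 10, 8)
y = Flux.onehotbatch(rand(1:2, 8), 1:2)
Flux.train!(loss, params(model), [(x, y)], Descent(0.1))
```

Because the model is ordinary Julia code, the same definition works with custom layers, plain functions, or other packages without a separate graph-building step.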

This advantage has been used, for example, to implement support for neural differential equations by fusing Flux and DifferentialEquations.jl into DiffEqFlux.jl.[7][8]
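A neural differential equation treats a neural network as the right-hand side of an ODE and differentiates through the solver. The sketch below assumes a `NeuralODE`-style constructor as found in DiffEqFlux releases of this period; exact signatures have varied across versions:

```julia
using Flux, DiffEqFlux, OrdinaryDiffEq

# A small network defines the ODE right-hand side du/dt = f(u).
dudt = Chain(Dense(2, 16, tanh), Dense(16, 2))
tspan = (0.0f0, 1.0f0)

# NeuralODE wraps the network and an ODE solver into a differentiable layer.
n_ode = NeuralODE(dudt, tspan, Tsit5(), saveat = 0.1f0)

u0 = Float32[2.0, 0.0]
sol = n_ode(u0)   # solve forward in time; gradients flow through the solver
```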

Flux supports recurrent and convolutional networks. It is also capable of differentiable programming[9][10][11] through its source-to-source automatic differentiation package, Zygote.[12]
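Zygote's source-to-source differentiation applies to ordinary Julia functions, not just neural-network layers. A minimal sketch:

```julia
using Zygote

# For f(x) = 3x^2 + 2x, the derivative is 6x + 2, so f'(5) = 32.
f(x) = 3x^2 + 2x
Zygote.gradient(f, 5)   # returns the tuple (32,)
```

Rather than recording operations on a tape at run time, Zygote generates derivative code directly from the function's source, which is what enables differentiating arbitrary Julia programs.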

Julia is among the most popular machine-learning languages on GitHub,[13] and Flux is recognized as its most highly regarded machine-learning repository.[13] A demonstration[14] compiling Julia code to run on Google's tensor processing units received praise from Google Brain AI lead Jeff Dean.[15]

Flux was used in the first application of machine learning to data encrypted with homomorphic encryption, without ever decrypting it.[16][17] This kind of application is envisioned to be central to privacy in future APIs that use machine-learning models.[18]

Flux.jl serves as an intermediate representation for running high-level programs on CUDA hardware.[19][20] It was the predecessor of CUDAnative.jl, which also provides GPU programming for Julia.[21]
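In practice, moving a Flux model onto CUDA hardware goes through CuArrays.jl, using Flux's `gpu` helper to convert parameters and data to GPU arrays. A minimal sketch, assuming a CUDA-capable device is available:

```julia
using Flux, CuArrays

# `gpu` moves the model's parameters to CUDA arrays; the same layer code
# then executes on the GPU via Julia's multiple dispatch.
model = Chain(Dense(10, 5, relu), Dense(5, 2)) |> gpu

x = rand(Float32, 10) |> gpu
y = model(x)   # forward pass runs on the GPU
```

No GPU-specific model code is needed; dispatch on the array type selects the CUDA kernels.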

References

  1. ^ a b Innes, Michael (2018-05-03). "Flux: Elegant machine learning with Julia". Journal of Open Source Software. 3 (25): 602. doi:10.21105/joss.00602.
  2. ^ Innes, Mike; Bradbury, James; Fischer, Keno; Gandhi, Dhairya; Mariya Joy, Neethu; Karmali, Tejan; Kelley, Matt; Pal, Avik; Concetto Rudilosso, Marco; Saba, Elliot; Shah, Viral; Yuret, Deniz. "Building a Language and Compiler for Machine Learning". julialang.org. Retrieved 2019-06-02.
  3. ^ FluxML/Flux.jl v0.10.1, Flux, 2020-01-13, retrieved 2020-01-18
  4. ^ "Machine Learning and Artificial Intelligence". juliacomputing.com. Retrieved 2019-06-02.
  5. ^ Gandhi, Dhairya (2018-11-15). "Julia at NeurIPS and the Future of Machine Learning Tools". juliacomputing.com. Retrieved 2019-06-02.
  6. ^ Malmaud, Jonathan; White, Lyndon (2018-11-01). "TensorFlow.jl: An Idiomatic Julia Front End for TensorFlow". Journal of Open Source Software. 3 (31): 1002. doi:10.21105/joss.01002.
  7. ^ Rackauckas, Chris; Innes, Mike; Ma, Yingbo; Bettencourt, Jesse; White, Lyndon; Dixit, Vaibhav (2019-02-06). "DiffEqFlux.jl - A Julia Library for Neural Differential Equations". arXiv:1902.02376 [cs.LG].
  8. ^ Schlothauer, Sarah (2019-01-25). "Machine learning meets math: Solve differential equations with new Julia library". JAXenter. Retrieved 2019-10-21.
  9. ^ "Flux – Reinforcement Learning vs. Differentiable Programming". fluxml.ai. Retrieved 2019-06-02.
  10. ^ "Flux – What Is Differentiable Programming?". fluxml.ai. Retrieved 2019-06-02.
  11. ^ Heath, Nick (December 6, 2018). "Julia vs Python: Which programming language will rule machine learning in 2019?". TechRepublic. Retrieved 2019-06-03.
  12. ^ Innes, Michael (2018-10-18). "Don't Unroll Adjoint: Differentiating SSA-Form Programs". arXiv:1810.07951 [cs.PL].
  13. ^ a b Heath, Nick (January 25, 2019). "GitHub: The top 10 programming languages for machine learning". TechRepublic. Retrieved 2019-06-03.
  14. ^ Saba, Elliot; Fischer, Keno (2018-10-23). "Automatic Full Compilation of Julia Programs and ML Models to Cloud TPUs". arXiv:1810.09868 [cs.PL].
  15. ^ Dean, Jeff [@JeffDean] (2018-10-23). "Julia + TPUs = fast and easily expressible ML computations" (Tweet). Retrieved 2019-06-02 – via Twitter.
  16. ^ Patrawala, Fatema (2019-11-28). "Julia Computing research team runs machine learning model on encrypted data without decrypting it". Packt Hub. Retrieved 2019-12-11.
  17. ^ "Machine Learning on Encrypted Data Without Decrypting It". juliacomputing.com. 2019-11-22. Retrieved 2019-12-11.
  18. ^ Yadav, Rohit (2019-12-02). "Julia Computing Uses Homomorphic Encryption For ML. Is It The Way Forward?". Analytics India Magazine. Retrieved 2019-12-11.
  19. ^ Roesch, Jared; Lyubomirsky, Steven; Kirisame, Marisa; Pollock, Josh; Weber, Logan; Jiang, Ziheng; Chen, Tianqi; Moreau, Thierry; Tatlock, Zachary (2019). "Relay: A High-Level IR for Deep Learning". arXiv:1904.08368.
  20. ^ Besard, Tim; Foket, Christophe; De Sutter, Bjorn (2019). "Effective Extensible Programming: Unleashing Julia on GPUs". IEEE Transactions on Parallel and Distributed Systems. 30 (4). Institute of Electrical and Electronics Engineers (IEEE): 827–841. doi:10.1109/tpds.2018.2872064.
  21. ^ Besard, Tim (2018). Abstractions for Programming Graphics Processors in High-Level Programming Languages (PhD). Ghent University.

Categories: Deep learning · Free science software · Machine learning · Software stubs · Data mining and machine learning software · Free software programmed in Julia · Software using the MIT license