Fast Artificial Neural Network

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by 202.167.250.43 (talk) at 04:23, 28 February 2018 (History). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.
Original author(s): Steffen Nissen
Initial release: November 2003
Stable release: 2.2.0 / 24 January 2012
Written in: C
Operating system: Cross-platform
Size: ~2 MB
Type: Library
License: LGPL
Website: leenissen.dk/fann/wp/, github.com/libfann/fann

Fast Artificial Neural Network (FANN) is a cross-platform, open-source programming library for developing multilayer feedforward artificial neural networks.

Characteristics

FANN supports cross-platform execution of single-layer and multilayer networks, and supports both fixed-point and floating-point arithmetic. It includes functions that simplify the creation, training and testing of neural networks. It has bindings for over 20 programming languages, including commonly used languages such as C# and Python.
On the FANN website, multiple graphical user interfaces are available for use with the library, such as FANNTool, Agiel Neural Network, Neural View, FannExplorer, sfann and others. These graphical interfaces facilitate the use of FANN for users who are not very familiar with programming, or who are seeking a simple out-of-the-box solution.
Training for FANN is carried out through backpropagation. The internal training functions are optimized to decrease the training time.
Trained artificial neural networks can be stored as .net files, so that ANNs can be quickly saved and loaded for future use or further training. This allows the user to partition training into multiple steps, which can be useful when dealing with large training datasets or sizable neural networks.

History

FANN was originally written by Steffen Nissen. Its original implementation is described in Nissen's 2003 report Implementation of a Fast Artificial Neural Network Library (FANN),[1] submitted to the computer science department at the University of Copenhagen (DIKU). In the report, Nissen explains that one of his primary motivations for writing FANN was to develop a neural network library supporting both fixed-point and floating-point arithmetic. Nissen wanted to develop an autonomous agent that could learn from experience; his goal was to use this agent to create a virtual player in Quake III Arena that could learn from gameplay.
Since its original 1.0.0 release, the library's functionality has been expanded by the creator and its many contributors to include more practical constructors, different activation functions, simpler access to parameters, and bindings to multiple programming languages. It has been downloaded 450,000 times since its move to SourceForge in 2003, and 29,000 times in 2016 alone.

The source code is now hosted on GitHub. The project has been inactive since November 2015: a number of pull requests are pending, and in the issues section some users mention that the author is no longer contactable.

Research

The original FANN report written by Steffen Nissen has been cited 337 times according to Google Scholar. The library has been used for research in image recognition, machine learning, biology, genetics, aerospace engineering, environmental sciences and artificial intelligence.
Some notable publications that have cited FANN include:

  • Supervised pattern classification based on optimum-path forest[2]
  • Efficient supervised optimum-path forest classification for large datasets[3]
  • A Multilevel Mixture-of-Experts Framework for Pedestrian Classification[4]
  • A stochastic model updating technique for complex aerospace structures[5]
  • Prediction of Local Structural Stabilities of Proteins from Their Amino Acid Sequences[6]

Language Bindings

While FANN was originally written in C, the following language bindings have been created by FANN contributors:

See also

References

  1. ^ Nissen, Steffen. Implementation of a Fast Artificial Neural Network Library (FANN). Department of Computer Science, University of Copenhagen (DIKU), 2003
  2. ^ Papa, J.P. Supervised pattern classification based on optimum-path forest. International Journal of Imaging Systems and Technology, 2009
  3. ^ Papa, J.P. Efficient supervised optimum-path forest classification for large datasets. Pattern Recognition, 2012
  4. ^ Enzweiler, M. A Multilevel Mixture-of-Experts Framework for Pedestrian Classification. IEEE Transactions on Image Processing, 2011
  5. ^ Goller, B. A stochastic model updating technique for complex aerospace structures. Finite Elements in Analysis and Design, 2011
  6. ^ Tartaglia, G. G. Prediction of Local Structural Stabilities of Proteins from Their Amino Acid Sequences. Structure, 2006