Receptron
The receptron (short for "reservoir perceptron") is a data processing model in neuromorphic computing that generalizes the traditional perceptron by incorporating non-linear interactions between inputs.[1][2][3] Unlike classical perceptrons, which rely on linearly independent weights, the receptron leverages the complexity of physical substrates,[4] such as the electric conduction properties of nanostructured materials or optical speckle fields, to perform classification tasks.[5][6] The receptron bridges unconventional computing and neural network principles,[7] enabling solutions that do not require the training approaches typical of artificial neural networks based on the perceptron model.[8]
Algorithm
The receptron is an algorithm for supervised learning of binary classifiers, i.e. a classification algorithm that makes its predictions based on a predictor function combining a set of weights with the feature vector.[9] The mathematical model is based on the sum of inputs with non-linear interactions:
$\Sigma = \sum_{i=1}^{N} w_i(x_1, \ldots, x_N)\, x_i$   (1)
where $w_i(x_1, \ldots, x_N)$ are non-linear weight functions depending on the inputs $x_i$. Nonlinearity typically makes the system extremely complex and allows for the solution of problems not solvable through the simpler rules of a linear system, such as the perceptron or McCulloch–Pitts neuron, which is based on the sum of linearly independent weights[10]:
$\Sigma = \sum_{i=1}^{N} w_i\, x_i$   (2)
where $w_i$ are constant real values. A consequence of this simplicity is the limitation to linearly separable functions, which necessitates multi-layer architectures and training algorithms such as backpropagation.[11]
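The contrast between Eq. 1 and Eq. 2 can be made concrete with a short sketch. The two input-dependent weight functions below are purely hypothetical illustrations, not taken from the receptron literature:

```python
import numpy as np

def perceptron_sum(x, w):
    """Eq. 2: linear sum with constant weights w_i."""
    return float(np.dot(w, x))

def receptron_sum(x, weight_fns):
    """Eq. 1: each weight w_i(x_1, ..., x_N) depends on the whole input vector."""
    return sum(w(x) * xi for w, xi in zip(weight_fns, x))

# Hypothetical non-linear weight functions for a two-input receptron.
weight_fns = [
    lambda x: 1.0 + 0.5 * x[1],  # w_1 grows with x_2
    lambda x: 1.0 - 2.0 * x[0],  # w_2 is suppressed by x_1
]

print(perceptron_sum([1.0, 1.0], [1.0, 1.0]))  # 2.0: weights ignore the input
print(receptron_sum([1.0, 1.0], weight_fns))   # 0.5: weights react to the input
```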
As in the perceptron case,[12] the summation in Eq. 1 gives rise to the activation of the receptron output through a thresholding process,
$y = \begin{cases} 1 & \text{if } \Sigma \geq \mathrm{th} \\ 0 & \text{otherwise} \end{cases}$   (3)
where $\mathrm{th}$ is a constant threshold parameter. Equation 3 can be written compactly using the Heaviside step function $\theta$ as $y = \theta(\Sigma - \mathrm{th})$.
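A one-line realization of the thresholding step, assuming the convention $\theta(0) = 1$ for the value exactly at threshold:

```python
import numpy as np

def receptron_output(sigma, th):
    """Eq. 3: binary activation y = theta(Sigma - th)."""
    # The second argument of np.heaviside fixes the value at exactly zero.
    return np.heaviside(sigma - th, 1.0)

print(receptron_output(0.6, 0.5))  # 1.0: the summation exceeds the threshold
print(receptron_output(0.4, 0.5))  # 0.0: below threshold, the output stays off
```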
The weight functions can be written with a finite number of parameters, simplifying the model representation. For Boolean inputs, one can Taylor-expand the weight functions and use the idempotency of Boolean variables ($x_i^2 = x_i$) so that $\Sigma$ can be written as
$\Sigma = \sum_{i=1}^{N} W_i\, x_i + \sum_{i<j} W_{ij}\, x_i x_j + \cdots + W_{12\cdots N}\, x_1 x_2 \cdots x_N$   (4)
where $W_i, W_{ij}, \ldots, W_{12\cdots N}$ are independent parameters that can be seen as the components of a tensor (the "weight tensor") of rank $N$.
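A minimal sketch of evaluating Eq. 4 for Boolean inputs: thanks to idempotency, each weight-tensor component can be keyed by a set of distinct indices, stored here in a dictionary (all parameter values are illustrative):

```python
from itertools import combinations

def receptron_sum_tensor(x, W):
    """Eq. 4: sum of weight-tensor components over all non-empty index subsets.

    x is a list of Boolean (0/1) inputs; W maps index tuples such as
    (i,) or (i, j) to the corresponding tensor component. Idempotency
    (x_i**2 == x_i) means each subset of indices appears exactly once.
    """
    total = 0.0
    for order in range(1, len(x) + 1):
        for idx in combinations(range(len(x)), order):
            prod = 1
            for i in idx:
                prod *= x[i]
            total += W.get(idx, 0.0) * prod
    return total

# Illustrative components for N = 2: two linear terms plus one cross term.
W = {(0,): 1.0, (1,): 1.0, (0, 1): -2.0}
print(receptron_sum_tensor([1, 1], W))  # 1 + 1 - 2 = 0.0
```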
The sum in Eq. 4 reduces to the perceptron case when the off-diagonal terms of the weight tensor vanish. If one considers $N = 2$, one gets:
$\Sigma = w_1 x_1 + w_2 x_2 + w_{12}\, x_1 x_2$   (5)
In the perceptron case, the vanishing of $w_{12}$ implies linearity, i.e. the superposition $\Sigma(x_1, x_2) = \Sigma(x_1, 0) + \Sigma(0, x_2)$ holds. In the receptron case $w_{12} \neq 0$, meaning that the superposition principle is no longer valid, the cross term $w_{12}\, x_1 x_2$ being responsible for the more complex non-linear interaction between the inputs.
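For instance, the choice $w_1 = w_2 = 1$, $w_{12} = -2$ and $\mathrm{th} = 0.5$ in Eq. 5 realizes the XOR function, which a single perceptron cannot compute (these parameter values are one illustrative solution, not unique):

```python
def receptron_n2(x1, x2, w1=1.0, w2=1.0, w12=-2.0, th=0.5):
    """Eq. 5 with a non-zero cross term; these defaults realize XOR."""
    sigma = w1 * x1 + w2 * x2 + w12 * x1 * x2
    return int(sigma >= th)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", receptron_n2(x1, x2))  # prints the XOR truth table
```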
Design and implementations
1. Electrical Receptron
[edit]Substrate: Nanostructured and nanocomposite films (Au, Pt, Zr Au/Zr). These films form disordered networks of nanoparticles with resistive switching and non-linear electrical conduction.
2. Optical Receptron
Substrate: Optical speckle fields generated by random interference of light emerging from a disordered medium illuminated by a laser or other coherent radiation.[13]
Key features
- Physical Substrate Computing: The receptron does not require digital training; instead, it exploits the natural complexity of materials (e.g., nanowire networks, diffractive media) to perform computations.
- Non-Linear Separability: Unlike traditional perceptrons, which fail on problems like the XOR function, the receptron can solve such tasks due to its inherent non-linearity.
- Training-Free Operation: Classification is achieved through the physical system's response rather than iterative weight adjustments, reducing computational overhead.
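Training-free operation can be emulated in software by treating each classification attempt as a measurement of a randomly configured substrate: configurations are sampled until one whose thresholded response reproduces the target truth table (here XOR) is found, with no gradient updates involved. The response model below is a hypothetical stand-in for a physical device, intended only to illustrate the search-based workflow:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def random_response():
    """One random 'substrate configuration': an analog output per input
    pattern, standing in for the measured response of a physical receptron."""
    return {pattern: rng.uniform(0.0, 1.0) for pattern in XOR}

def solves(response, th=0.5):
    """Check whether the thresholded response reproduces the target table."""
    return all((response[p] >= th) == bool(y) for p, y in XOR.items())

samples = 1
while not solves(random_response()):
    samples += 1
print(f"configuration solving XOR found after {samples} random samples")
```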
References
- ^ Mirigliano, Matteo; Paroli, Bruno; Martini, Gianluca; Fedrizzi, Marco; Falqui, Andrea; Casu, Alberto; Milani, Paolo (2021-12-01). "A binary classifier based on a reconfigurable dense network of metallic nanojunctions". Neuromorphic Computing and Engineering. 1 (2): 024007. doi:10.1088/2634-4386/ac29c9. ISSN 2634-4386.
- ^ Paroli, B.; Borghi, F.; Potenza, M. A. C.; Milani, P. (2025-06-24), The receptron is a nonlinear threshold logic gate with intrinsic multi-dimensional selective capabilities for analog inputs, arXiv:2506.19642
- ^ Perez, Jake C.; Shaheen, Sean E. (August 2020). "Neuromorphic-based Boolean and reversible logic circuits from organic electrochemical transistors". MRS Bulletin. 45 (8): 649–654. Bibcode:2020MRSBu..45..649P. doi:10.1557/mrs.2020.202. ISSN 0883-7694.
- ^ Stieg, Adam Z.; Avizienis, Audrius V.; Sillin, Henry O.; Martin-Olmos, Cristina; Aono, Masakazu; Gimzewski, James K. (2012-01-10). "Emergent Criticality in Complex Turing B-Type Atomic Switch Networks". Advanced Materials. 24 (2): 286–293. Bibcode:2012AdM....24..286S. doi:10.1002/adma.201103053. ISSN 0935-9648. PMID 22329003.
- ^ Paroli, B.; Martini, G.; Potenza, M. A. C.; Siano, M.; Mirigliano, M.; Milani, P. (2023-09-01). "Solving classification tasks by a receptron based on nonlinear optical speckle fields". Neural Networks. 166: 634–644. doi:10.1016/j.neunet.2023.08.001. ISSN 0893-6080. PMID 37604074. Archived from the original on 2024-04-18. Retrieved 2025-09-03.
- ^ Iyer, Prasad P.; Bhatt, Gaurang R.; Desai, Saaketh; Fuller, Elliot J.; Teeter, Corinne M.; Léonard, François; Vineyard, Craig M. (2025-08-08). "Is Computing with Light All You Need? A Perspective on Codesign for Optical Artificial Intelligence and Scientific Computing". Advanced Intelligent Systems 2500371. doi:10.1002/aisy.202500371. ISSN 2640-4567.
- ^ Frenkel, Charlotte; Bol, David; Indiveri, Giacomo (June 2023). "Bottom-Up and Top-Down Approaches for the Design of Neuromorphic Processing Systems: Tradeoffs and Synergies Between Natural and Artificial Intelligence". Proceedings of the IEEE. 111 (6): 623–652. doi:10.1109/JPROC.2023.3273520. ISSN 0018-9219.
- ^ Barrows, Frank; Lin, Jonathan; Caravelli, Francesco; Chialvo, Dante R. (July 2025). "Uncontrolled Learning: Codesign of Neuromorphic Hardware Topology for Neuromorphic Algorithms". Advanced Intelligent Systems. 7 (7) 2400739. doi:10.1002/aisy.202400739. ISSN 2640-4567.
- ^ Widrow, B.; Lehr, M.A. (September 1990). "30 years of adaptive neural networks: perceptron, Madaline, and backpropagation". Proceedings of the IEEE. 78 (9): 1415–1442. Bibcode:1990IEEEP..78.1415W. doi:10.1109/5.58323.
- ^ Shukla, Anupam; Tiwari, Ritu; Kala, Rahul (2010), "Artificial Neural Networks", Towards Hybrid and Adaptive Computing, vol. 307, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 31–58, doi:10.1007/978-3-642-14344-1_2, ISBN 978-3-642-14343-4, retrieved 2025-11-06
- ^ Goh, A.T.C. (January 1995). "Back-propagation neural networks for modeling complex systems". Artificial Intelligence in Engineering. 9 (3): 143–151. doi:10.1016/0954-1810(94)00011-S.
- ^ Block, H. D. (1962-01-01). "The Perceptron: A Model for Brain Functioning. I". Reviews of Modern Physics. 34 (1): 123–135. Bibcode:1962RvMP...34..123B. doi:10.1103/RevModPhys.34.123. ISSN 0034-6861.
- ^ Paroli, Bruno; Malfer, Alessandro; Potenza, Marco A.C.; Siano, Mirko; Milani, Paolo (2025-08-21). "Binary Pattern Classification with a Photonic Neuromorphic Device Based on Optical Receptrons". Laser & Photonics Reviews e00970. doi:10.1002/lpor.202500970. ISSN 1863-8880.