Lateral computing
1. INTRODUCTION
The traditional or conventional approach to solving computing problems is either to build mathematical models or to use an IF-THEN-ELSE structure. For example, a chess program using brute-force search arrives at a solution by thinking logically through the moves exhaustively (Hsu, 2002). However, this approach is computationally expensive and may arrive at poor solutions for problems such as pattern recognition, optimisation and certain control problems. Lateral-computing is a lateral-thinking approach to solving computing problems. Lateral thinking has been made popular by Edward de Bono (De Bono, 1990); the technique is applied to generate creative ideas and solve problems. Similarly, by applying the lateral-computing technique, a computationally inexpensive, easy-to-implement, efficient, innovative and unconventional solution is arrived at for the computing problem at hand.
A simple problem, backing up a truck, can be used to illustrate lateral-computing. This is one of the difficult tasks for traditional computing techniques, yet it has been solved efficiently by fuzzy logic, which is a lateral-computing technique. Lateral-computing sometimes arrives at a novel solution for a particular computing problem by modelling how living beings such as humans, ants and honeybees solve problems, how pure crystals are formed by annealing, how living beings evolve, how quantum systems behave, and so on.
Contents
1. Review of lateral-thinking techniques and introduction to lateral-computing.
2. Well-established lateral-computing techniques.
3. Several examples of successful applications of lateral-computing.
4. Summary and conclusions.
5. References
LOGICAL THINKING AND ARTIFICIAL INTELLIGENCE
Chess position analysis can be used to illustrate logical thinking. Figure 1 shows a chess problem that must be solved in two moves.
Figure 1: Chess problem, White to move and checkmate in two moves
White has several candidate moves with which to checkmate Black. Moves such as Rd5xd7 or Rf7xd7 immediately provide a material advantage to White, and there are similar captures that do the same. But the knight move Ktc6, which provides no material advantage at all, forces checkmate of Black in two moves:
1. Ktc6
1 ... Kxf7 2 g8Q++
1 ... Kxd5 2 Qa2++
1 ... Rdxd5 2 Re7++
1 ... Rfxd5 2 Rf6++
1 ... Rdxf7 2 Rd6++
1 ... Rfxf7 2 Re5++
This example illustrates the limits of purely logical thinking. Logical thinking in chess progresses by evaluating the immediate material gain of each move, which here results either in a solution requiring more moves or in failure to checkmate. The not-so-obvious knight move, even though it does not look logical, is the solution to the two-move checkmate problem. A computer programmed to play chess will thus miss good opportunities if it only finds moves logically by brute-force search. Several attempts have been made in history to build powerful chess computers (Hsu, 2002), yet these chess computers have been defeated by grandmaster human chess players.
Logic Programming: http://en.wikipedia.org/wiki/Logic_programming
Attempts to use logic programming languages such as Prolog to represent knowledge and build artificially intelligent systems have not provided the anticipated thrust towards solving interesting problems (Boden, 1990; Russell and Norvig, 2003). The lack of generalization and learning capability in these systems, and the exponential growth of their IF-THEN-ELSE rules, have made this approach unpopular. The following flawed proof illustrates how a rule-based system can fail:
Start with: 81/4 = 81/4
Adding -20 to both sides: -20 + 81/4 = -20 + 81/4
Splitting -20 as (-36 + 16) on the LHS and (-45 + 25) on the RHS: 16 + 81/4 - 36 = 25 + 81/4 - 45
Expressing 16, 25 and 81/4 as the squares of 4, 5 and 9/2 respectively: 4^2 + (9/2)^2 - 2*(9/2)*4 = 5^2 + (9/2)^2 - 2*(9/2)*5
Applying a^2 + b^2 - 2*a*b = (a-b)^2 to both sides: (4 - 9/2)^2 = (5 - 9/2)^2
Taking the square roots: 4 - 9/2 = 5 - 9/2
This would amazingly yield 4 = 5, which is a wrong result. In taking the square roots, we skipped the step of considering signs: 4 - 9/2 equals -1/2 while 5 - 9/2 equals +1/2, so the two sides agree in magnitude but not in sign, and the absurd outcome follows. A rule-based system that misses even one simple rule in its database may yield a similarly unacceptable output.
FROM LATERAL THINKING TO LATERAL-COMPUTING
Lateral Thinking (http://en.wikipedia.org/wiki/Lateral_thinking) is a technique for creative thinking and problem solving (De Bono, 1990). The brain, as the centre of thinking, has a self-organizing information system. It tends to create patterns, and the traditional thinking process uses them to solve problems. The lateral-thinking technique proposes escaping from this patterning to arrive at better solutions through new ideas. Provocative use of information processing is the basic principle underlying lateral thinking.
The provocative operator (PO) is something that characterizes lateral thinking. Its function is to generate new ideas by provocation and to provide an escape route from old ideas. It creates a provisional arrangement of information.
Water-logic: Water logic contrasts with traditional or rock logic (De Bono, 1991). Water logic has boundaries that depend on circumstances and conditions, while rock logic has hard boundaries. Water logic in some ways resembles fuzzy logic.
Transition to Lateral-Computing:
Lateral-computing makes provocative use of information processing, similar to lateral thinking. This can be explained using evolutionary computing, a very useful lateral-computing technique. Evolution proceeds by change and selection: random mutation provides the change, while selection is through survival of the fittest. The random mutation works as provocative information processing and provides a new avenue for generating better solutions to the computing problem.
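As a minimal sketch of this provocative step, the following Python fragment evolves binary strings towards an all-ones target ("OneMax", a standard toy problem). The representation, population size and rates are illustrative assumptions, not taken from the cited sources.

```python
import random

# Minimal genetic algorithm for OneMax: maximize the number of 1-bits
# in a 20-bit string. All parameters here are illustrative choices.
def fitness(chromosome):
    return sum(chromosome)

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(100):
    # Selection: the fitter chromosomes are kept as parents.
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = []
    while len(children) < 30:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, 20)
        child = a[:cut] + b[cut:]          # crossover of two parents
        # Mutation is the "provocative" step: an arbitrary bit flip that
        # looks illogical in isolation but opens paths to better solutions.
        if random.random() < 0.1:
            i = random.randrange(20)
            child[i] ^= 1
        children.append(child)
    population = children
print(fitness(max(population, key=fitness)))   # typically reaches the optimum, 20
```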
Lateral-computing takes its analogies from real-world examples, such as:
How slow cooling of a hot, disordered state results in pure crystals (annealing)
How neural networks in the brain solve problems such as face and speech recognition
How simple insects such as ants and honeybees solve sophisticated problems
How the evolution of living beings from molecular life forms is mimicked by evolutionary computing
How living organisms defend themselves against diseases and heal their wounds
How electricity is distributed by grids
Differentiating factors of “Lateral Computing”:
Does not directly approach the problem through mathematical means.
Uses indirect models or looks for analogies to solve the problem
Radically different from what is in vogue, such as using photons for computing in optical computing. This is rare, as most conventional computers use electrons to carry signals.
Sometimes the Lateral Computing techniques are surprisingly simple and deliver high performance solutions to very complex problems
Some lateral-computing techniques use "unexplained jumps" that may not look logical; an example is the mutation operator in genetic algorithms.
a) Conventional versus Lateral: It is very hard to draw a clear boundary between conventional and lateral computing. Over time, some unconventional computing techniques become an integral part of mainstream computing, so there will always be an overlap between the two. Classifying a computing technique as conventional or lateral is therefore a tough task: the boundaries are fuzzy, and one may approach the classification with fuzzy sets.
b) Formal Definition:
Lateral-computing is a fuzzy set of all computing techniques that use an unconventional computing approach. Hence lateral-computing includes techniques that use semi-conventional or hybrid computing. The degree of membership for lateral-computing techniques is greater than 0 in the fuzzy set of unconventional computing techniques.
The following table brings out some important differentiators for Lateral Computing:
Conventional Computing:
- The problem and technique are directly correlated.
- Treats the problem with rigorous mathematical analysis.
- Creates mathematical models.
- The computing technique can be analyzed mathematically.

Lateral Computing:
- The problem may hardly have any relation to the computing technique used.
- Approaches problems by analogies, such as the human information processing model, annealing, etc.
- Sometimes the computing technique cannot be analyzed mathematically.
c) Lateral Computing and Parallel Computing:
Parallel computing focuses on improving the performance of computers and algorithms through the use of several computing (processing) elements (Hwang, 1993); the computing speed is improved by using several computing elements at once. Parallel computing is an extension of conventional sequential computing. In lateral computing, by contrast, the problem is solved using unconventional information processing, whether with sequential or parallel computing.
A REVIEW OF LATERAL-COMPUTING TECHNIQUES
Several computing techniques fit the lateral-computing paradigm. A brief description of some of them follows:
Swarm Intelligence: http://en.wikipedia.org/wiki/Swarm_intelligence Swarm Intelligence (SI) is the property of a system whereby the collective behaviours of (unsophisticated) agents interacting locally with their environment cause coherent functional global patterns to emerge (Bonabeau et al., 1999). SI provides a basis for exploring collective (or distributed) problem solving without centralized control or the provision of a global model.
One interesting swarm-intelligence technique is the Ant Colony algorithm (Dorigo et al., 1999), sketched in code after this list:
Ants are behaviourally unsophisticated, yet collectively they perform complex tasks; they have highly developed, sophisticated sign-based communication.
Ants communicate using pheromones; trails are laid that can be followed by other ants.
Routing problem: ants dropping different pheromones can be used to compute the "shortest" path from a source to destination(s).
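The following Python fragment is a minimal sketch of the pheromone idea on a small hypothetical graph; the node names, edge lengths and parameters are invented for illustration, and this is a caricature of the published algorithm, not its exact form.

```python
import random

# Toy ant-colony shortest-path search. Hypothetical weighted graph.
graph = {
    'A': {'B': 2.0, 'C': 5.0},
    'B': {'A': 2.0, 'C': 1.0, 'D': 4.0},
    'C': {'A': 5.0, 'B': 1.0, 'D': 1.0},
    'D': {'B': 4.0, 'C': 1.0},
}
pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

def walk(src, dst):
    """One ant walks probabilistically from src to dst, biased by pheromone."""
    path, node = [src], src
    while node != dst:
        choices = [n for n in graph[node] if n not in path] or list(graph[node])
        weights = [pheromone[(node, n)] / graph[node][n] for n in choices]
        node = random.choices(choices, weights=weights)[0]
        path.append(node)
    return path

def length(path):
    return sum(graph[u][v] for u, v in zip(path, path[1:]))

best = None
for _ in range(200):                       # send out 200 ants
    p = walk('A', 'D')
    for u, v in zip(p, p[1:]):             # deposit pheromone: shorter path, more scent
        pheromone[(u, v)] += 1.0 / length(p)
    for e in pheromone:                    # evaporation keeps old trails from dominating
        pheromone[e] *= 0.99
    if best is None or length(p) < length(best):
        best = p
print(best, length(best))                  # typically A -> B -> C -> D, length 4.0
```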
Agent Based Systems:
Agents are encapsulated computer systems that are situated in some environment and are capable of flexible, autonomous action in that environment in order to meet their design objectives (Bradshaw, 1997). Agents are considered to be autonomous (independent, not controllable), reactive (responding to events), pro-active (initiating actions of their own volition) and social (communicative). Agents vary in their abilities: they can be static or mobile, and may or may not be intelligent. Each agent may have its own task and/or role. Agents and multi-agent systems are used as a metaphor to model complex distributed processes. Such agents invariably need to interact with one another in order to manage their interdependencies; these interactions involve agents cooperating, negotiating and coordinating with one another.
Agent-based systems are computer programs that try to simulate various complex phenomena via virtual "agents" that represent the components of, for example, a business system. The behaviours of these agents are programmed with rules that realistically depict how business is conducted. As widely varied individual agents interact in the model, the simulation shows how their collective behaviours govern the performance of the entire system, for instance the emergence of a successful product or an optimal schedule. Such simulations are powerful strategic tools for "what-if" scenario analysis: as managers change agent characteristics or "rules", the impact of the change can easily be seen in the model output.
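The following is a minimal, hypothetical agent-based sketch: consumer agents choose between two products based on price and on what randomly chosen "neighbours" chose. The scenario and rules are invented purely to illustrate how simple individual rules produce collective behaviour.

```python
import random

# Minimal agent-based simulation: 100 consumers choosing product 'A' or 'B'.
class Consumer:
    def __init__(self):
        self.choice = random.choice(['A', 'B'])

    def step(self, prices, neighbours):
        # Behavioural rule: prefer the cheaper product, but with some
        # probability imitate the majority choice of neighbours (social effect).
        cheaper = min(prices, key=prices.get)
        majority = max('AB', key=lambda p: sum(n.choice == p for n in neighbours))
        self.choice = majority if random.random() < 0.3 else cheaper

agents = [Consumer() for _ in range(100)]
prices = {'A': 10.0, 'B': 9.0}             # a "what-if": B undercuts A
for _ in range(50):                        # run 50 simulation steps
    for a in agents:
        a.step(prices, random.sample(agents, 5))   # 5 random "neighbours"
share_B = sum(a.choice == 'B' for a in agents) / len(agents)
print(f"market share of cheaper product B: {share_B:.0%}")
```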
Grid Computing: http://en.wikipedia.org/wiki/Grid_Computing
By analogy with the electricity grid, a computational grid is a hardware and software infrastructure that provides dependable, consistent, pervasive and inexpensive access to high-end computational capabilities (Foster, 1999). Applications of grid computing include chip design, cryptographic problems, medical instrumentation and supercomputing.
Distributed supercomputing applications use grids to aggregate substantial computational resources in order to tackle problems that cannot be solved on a single system.
Autonomic Computing: http://en.wikipedia.org/wiki/Autonomic_Computing The autonomic nervous system governs our heart rate and body temperature, thus freeing our conscious brain from the burden of dealing with these and many other low-level yet vital functions. The essence of autonomic computing is self-management, the intent of which is to free system administrators from the details of system operation and maintenance (Murch, 2004).
The four aspects of autonomic computing are:
Self-configuration,
Self-optimization,
Self-healing,
Self-protection.
This is a grand challenge promoted by IBM (IBM, 2004); a minimal control-loop sketch follows.
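The sketch below illustrates the monitor-analyze-act idea behind self-management; the monitored "service", metrics and thresholds are hypothetical placeholders, not IBM's architecture or any real systems API.

```python
import random
import time

# Minimal self-management loop over a hypothetical service.
def monitor():
    """Monitor: observe (here, fake) metrics of the managed system."""
    return {'cpu': random.uniform(0, 100), 'healthy': random.random() > 0.1}

def execute(action):
    """Execute: apply the planned action (here, just print it)."""
    print("action:", action)

replicas = 1
for _ in range(5):                            # a few iterations of the loop
    m = monitor()
    if not m['healthy']:                      # self-healing
        execute("restart failed service")
    elif m['cpu'] > 80 and replicas < 4:      # self-optimization / self-configuration
        replicas += 1
        execute(f"scale out to {replicas} replicas")
    elif m['cpu'] < 20 and replicas > 1:
        replicas -= 1
        execute(f"scale in to {replicas} replicas")
    time.sleep(0.01)
```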
Optical Computing: http://en.wikipedia.org/wiki/Optical_computing
Optical computing uses photons rather than conventional electrons for computing (Karim and Awwal, 1992). There are quite a few instances of optical computers and of their successful use. Conventional logic gates use semiconductors, which use electrons for transporting signals; in optical computers, the photons in a light beam are used to do the computation. Optical devices offer numerous advantages for computing, such as immunity to electromagnetic interference and large bandwidth.
DNA Computing: http://en.wikipedia.org/wiki/DNA_computing The idea is to use strands of DNA to encode an instance of a problem and to manipulate them, using techniques commonly available in any molecular biology laboratory, in order to simulate operations that select the solution of the problem if it exists.
Since the DNA molecule is also a code, made up of a sequence of four bases that pair up in a predictable manner, many scientists have considered the possibility of creating a molecular computer. Such computers rely on the massively parallel reactions of DNA nucleotides binding with their complements, a brute-force method that, it has been claimed, could yield a new generation of computers 100 billion times faster than today's fastest PC. DNA computing has been heralded as the "first example of true nanotechnology" and even "the start of a new era", forging an unprecedented link between computer science and the life sciences.
An example application of DNA computing is solving the Hamiltonian path problem, a known NP-complete problem; the number of required lab operations using DNA grows linearly with n, the number of vertices of the graph (Pisanti, 1997). Molecular algorithms have also been reported that solve cryptographic problems in a polynomial number of steps; as is well known, factoring large numbers is a relevant problem in many cryptographic applications.
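The following Python fragment is a software caricature of this generate-and-filter approach to the Hamiltonian path problem: create an enormous "soup" of random paths (in the lab, strands ligating at random) and then filter out everything that is not a Hamiltonian path. The graph and soup size are invented for illustration.

```python
import random

# Directed edges of a hypothetical 4-vertex graph.
edges = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}
n = 4

def random_path():
    """One 'strand': a random sequence of n vertices (lab step: random ligation)."""
    return tuple(random.randrange(n) for _ in range(n))

soup = [random_path() for _ in range(100_000)]   # massively parallel generation

# Filtering steps, mirroring the lab protocol:
soup = [p for p in soup if all(e in edges for e in zip(p, p[1:]))]  # real walks only
soup = [p for p in soup if len(set(p)) == n]     # must visit every vertex once
print(soup[0] if soup else "no Hamiltonian path found")   # e.g. (0, 1, 2, 3)
```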
Quantum Computing: http://en.wikipedia.org/wiki/Quantum_computing
In a quantum computer, the fundamental unit of information (called a quantum bit or qubit) is not binary but rather more quaternary in nature (Braunstein, 1999; Fortnow, 2003). This property arises as a direct consequence of the qubit's adherence to the laws of quantum mechanics, which differ radically from the laws of classical physics. A qubit can exist not only in a state corresponding to the logical state 0 or 1, as a classical bit can, but also in states corresponding to a blend or superposition of these classical states. In other words, a qubit can exist as a zero, a one, or simultaneously as both 0 and 1, with a numerical coefficient representing the probability amplitude of each state. A quantum computer manipulates qubits by executing a series of quantum gates, each a unitary transformation acting on a single qubit or a pair of qubits; by applying these gates in succession, a quantum computer can perform a complicated unitary transformation on a set of qubits in some initial state.
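In standard bra-ket notation (a textbook formulation, not specific to the sources cited above), the state of a single qubit can be written as:

```latex
% State of a single qubit: a superposition of the basis states |0> and |1>.
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad \alpha, \beta \in \mathbb{C},
  \qquad |\alpha|^2 + |\beta|^2 = 1
\]
% Measuring the qubit yields 0 with probability |alpha|^2 and 1 with |beta|^2.
```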
Reconfigurable Computing: http://en.wikipedia.org/wiki/Reconfigurable_computing
Field Programmable Gate Arrays (FPGAs) are making it possible to build truly reconfigurable computers (Suthikshn, 1996). The computer architecture is transformed by on-the-fly reconfiguration of the FPGA circuitry, and the optimal matching between architecture and algorithm improves the performance of the reconfigurable computer. The key feature is hardware performance combined with software flexibility.
For several applications, such as fingerprint matching and DNA sequence comparison, reconfigurable computers have been shown to perform several orders of magnitude better than conventional computers (Compton and Hauck, 2002).
Simulated Annealing: http://en.wikipedia.org/wiki/Simulated_annealing This algorithm is designed by looking at how pure crystals form from a heated state while the system is cooled slowly (Aarts and Korst, 1997). The computing problem is recast as a simulated-annealing exercise, and the solutions are then arrived at. The working principle of simulated annealing is borrowed from metallurgy: a piece of metal is heated (the atoms are given thermal agitation) and then left to cool slowly. The slow, regular cooling allows the atoms to slide progressively into their most stable ("minimal energy") positions; rapid cooling would have "frozen" them in whatever positions they happened to be at the time. The resulting structure of the metal is stronger and more stable. By simulating the process of annealing inside a computer program, we are able to find answers to difficult and very complex problems: instead of minimizing the energy of a block of metal or maximizing its strength, the program minimizes or maximizes some objective relevant to the problem at hand.
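A minimal simulated-annealing sketch follows, minimizing an illustrative one-dimensional objective; the objective function, neighbour move and cooling schedule are conventional but arbitrary choices, not from the cited text.

```python
import math
import random

# Objective with several local minima; the global minimum is near x = -1.3.
def f(x):
    return x * x + 10 * math.sin(x)

x = random.uniform(-10, 10)              # initial solution: a "hot", disordered state
T = 10.0                                 # initial temperature
while T > 1e-3:
    candidate = x + random.gauss(0, 1)   # thermal agitation: a random neighbour
    dE = f(candidate) - f(x)
    # Accept improvements always; accept worse moves with probability exp(-dE/T),
    # which lets the search escape local minima while the system is still hot.
    if dE < 0 or random.random() < math.exp(-dE / T):
        x = candidate
    T *= 0.999                           # slow cooling schedule
print(f"x = {x:.3f}, f(x) = {f(x):.3f}")
```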
Soft Computing: http://en.wikipedia.org/wiki/Soft_computing
One of the main components of lateral-computing is soft computing, which approaches problems with the human information processing model (Proc IEEE, 2001). Soft computing comprises fuzzy logic, neuro-computing, evolutionary computing, machine learning and probabilistic-chaotic computing.
Neuro Computing: Instead of solving a problem by creating a non-linear equation model of it, the biological neural-network analogy is used (Masters, 1995). The neural network is trained like a human brain to solve the given problem. This approach has become highly successful for some pattern recognition problems.
Evolutionary Computing: http://en.wikipedia.org/wiki/Evolutionary_computing The genetic algorithm (GA) resembles natural evolution and provides a universal optimization technique (Goldberg, 2000). Genetic algorithms start with a population of chromosomes, which represent the candidate solutions. The solutions are evaluated using a fitness function, and a selection process determines which solutions are carried into the competition. New solutions are created using evolutionary operators such as mutation and crossover. These algorithms are highly successful in solving search and optimization problems.
Fuzzy Logic: http://en.wikipedia.org/wiki/Fuzzy_logic Fuzzy logic is based on the fuzzy-set concepts proposed by Lotfi Zadeh (Ross, 1997). The degree-of-membership concept is central to fuzzy sets: fuzzy sets differ from crisp sets in that they allow an element to belong to a set to a degree (its degree of membership). This approach finds good application in control problems (Kosko, 1997). Fuzzy logic has found enormous application and already has a big market presence in consumer electronics such as washing machines, microwave ovens, mobile phones, televisions and camcorders.
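A minimal sketch of the degree-of-membership idea follows: triangular membership functions for a made-up temperature controller. The linguistic terms and breakpoints are invented for illustration, not taken from any cited system.

```python
# Triangular membership function: rises from a to a peak at b, falls to c.
def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(temp):
    """Map a crisp temperature to degrees of membership in fuzzy sets."""
    return {
        'cold': tri(temp, -5, 5, 18),
        'comfortable': tri(temp, 15, 21, 27),
        'hot': tri(temp, 24, 35, 45),
    }

# 17 degrees is simultaneously a bit 'cold' and a bit 'comfortable':
# unlike a crisp set, membership is a matter of degree.
print(fuzzify(17))   # approx {'cold': 0.077, 'comfortable': 0.333, 'hot': 0.0}
```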
Probabilistic/Chaotic Computing: Probabilistic computing engines use, for example, probabilistic graphical models such as Bayesian networks. Such computational techniques are referred to as randomization, yielding probabilistic algorithms. When interpreted as a physical phenomenon through classical statistical thermodynamics, such techniques lead to energy savings that are proportional to the probability p with which each primitive computational step is guaranteed to be correct (or, equivalently, to the probability of error, 1 - p) (Palem, 2003). Chaotic computing is based on chaos theory (Gleick, 1998).
Fractal Computing: http://en.wikipedia.org/wiki/Fractals Fractals are objects displaying self-similarity at different scales (Mandelbrot, 1977). Fractal generation involves small iterative algorithms. Fractals have dimensions greater than their topological dimensions; a fractal curve can have infinite length, so its size cannot be measured in the usual way. A fractal is described by an iterative algorithm, unlike a Euclidean shape, which is given by a simple formula. There are several types of fractals, and the Mandelbrot set is a very popular one.
Fractals have found applications in image processing, image compression, music generation, computer games, etc. The Mandelbrot set is a fractal named after its creator. Unlike other fractals, even though the Mandelbrot set is self-similar at magnified scales, its small-scale details are not identical to the whole; that is, the Mandelbrot set is infinitely complex, yet the process of generating it is based on an extremely simple equation. The Mandelbrot set M is a collection of complex numbers: a complex constant c belongs to M if the iteration of the Mandelbrot equation, started from z_0 = 0, remains bounded. Mandelbrot equation: z_{n+1} = z_n^2 + c.
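A minimal membership test follows directly from the equation; the iteration limit and the escape radius of 2 are conventional choices.

```python
# Test whether a complex constant c belongs to the Mandelbrot set by
# iterating z -> z^2 + c from z = 0 and checking whether |z| stays bounded.
def in_mandelbrot(c, max_iter=100):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:          # once |z| > 2 the orbit provably escapes
            return False
    return True

print(in_mandelbrot(0 + 0j))    # True: the orbit stays at 0
print(in_mandelbrot(-1 + 0j))   # True: the orbit cycles 0, -1, 0, -1, ...
print(in_mandelbrot(1 + 0j))    # False: 0, 1, 2, 5, 26, ... escapes
```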
Randomized Algorithms: http://en.wikipedia.org/wiki/Randomized_algorithms A randomized algorithm makes arbitrary (random) choices during its execution, which can save the execution time that deterministic choices would cost. The disadvantage of this method is the possibility that an incorrect solution will occur; a well-designed randomized algorithm, however, will have a very high probability of returning a correct answer (Motwani and Raghavan, 1995). The two categories of randomized algorithms are:
Monte Carlo algorithms
Las Vegas algorithms
Consider an algorithm for finding the kth smallest element of an array. A deterministic approach would be to choose a pivot element near the median of the list and partition the list around that element; the randomized approach is simply to choose the pivot at random, saving the time spent finding a good pivot. Like approximation algorithms, randomized algorithms can be used to attack tough NP-complete problems more quickly. An advantage over approximation algorithms, however, is that a randomized algorithm of the Las Vegas kind will eventually yield an exact answer if executed enough times. A minimal sketch of the randomized approach follows.
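The sketch below is a standard randomized quickselect, with k counted from zero; the sample data are invented for illustration.

```python
import random

# Find the kth smallest element (0-based) by partitioning around a random pivot.
def quickselect(arr, k):
    pivot = random.choice(arr)            # random pivot: no costly median search
    lo = [x for x in arr if x < pivot]
    eq = [x for x in arr if x == pivot]
    hi = [x for x in arr if x > pivot]
    if k < len(lo):
        return quickselect(lo, k)
    if k < len(lo) + len(eq):
        return pivot
    return quickselect(hi, k - len(lo) - len(eq))

data = [7, 2, 9, 4, 1, 8, 3]
print(quickselect(data, 3))   # 4, the 4th smallest; expected linear time
```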
Machine Learning: http://en.wikipedia.org/wiki/Machine_Learning Human beings and animals learn new skills, languages and concepts. Similarly, machine learning algorithms provide the capability to generalize from training data (Mitchell, 1997). There are two classes of machine learning (ML):
Supervised ML
Unsupervised ML
One well-known machine learning technique is the backpropagation algorithm (Masters, 1995), which mimics how humans learn from examples. The training patterns are repeatedly presented to the network; the error is propagated backwards, and the network weights are adjusted using gradient descent. The network converges through several hundred iterative computations.
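A minimal backpropagation sketch follows: a tiny network with one hidden layer learning XOR by gradient descent. The network size, learning rate and epoch count are illustrative choices, not taken from the cited text.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR training patterns and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))             # input -> 4 hidden units
W2 = rng.normal(size=(4, 1))             # hidden -> output

for epoch in range(20000):
    h = sigmoid(X @ W1)                  # forward pass: hidden activations
    out = sigmoid(h @ W2)                # forward pass: network output
    err = out - y                        # error at the output layer
    # Backward pass: propagate the error and adjust weights by gradient descent.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h
print(out.round(3).ravel())              # converges towards [0, 1, 1, 0]
```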
Support Vector Machines (Joachims, 2002): http://en.wikipedia.org/wiki/Support_vector_machines This is another class of highly successful machine learning techniques, applied to tasks such as text classification, speaker recognition and image recognition.
SOME EXAMPLE APPLICATIONS OF LATERAL-COMPUTING
There are several successful applications of lateral-computing techniques. Here we provide a small set of applications to illustrate lateral computing:
Bubble sorting: Here the computing problem of sorting is approached with the analogy of bubbles rising in water: the numbers are treated as bubbles and floated to their natural positions.
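A minimal sketch of the analogy: on each pass, the larger "bubbles" rise towards the end of the list.

```python
# Bubble sort: repeatedly swap adjacent out-of-order elements so that,
# on each pass, the largest remaining element "floats up" to its place.
def bubble_sort(a):
    a = list(a)
    for i in range(len(a) - 1, 0, -1):
        for j in range(i):
            if a[j] > a[j + 1]:                   # heavier bubble below a lighter one
                a[j], a[j + 1] = a[j + 1], a[j]   # swap so the larger rises
    return a

print(bubble_sort([5, 1, 4, 2, 8]))               # [1, 2, 4, 5, 8]
```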
Truck backing-up problem: This is an interesting problem of reversing a truck and parking it at a particular location. Traditional computing techniques have found it difficult to solve, but it has been solved successfully by fuzzy systems (Kosko, 1997).
Balancing an inverted pendulum: This problem involves balancing an inverted pendulum, and it has been solved efficiently by neural networks and fuzzy systems (Kosko, 1997).
Smart volume control for mobiles: The volume control in mobile phones depends on the background noise level, the noise class, the hearing profile of the user and other parameters. The measurement of noise and loudness levels involves imprecision and subjective measures. The authors have demonstrated the successful use of a fuzzy logic system for volume control in mobile handsets (Suthikshn, 2003).
Optimization using genetic algorithms and simulated annealing: Problems such as the traveling salesman problem have been shown to be NP-complete (Garey and Johnson, 1979) and are solved using algorithms that benefit from heuristics; applications include VLSI routing, partitioning, etc. Genetic algorithms and simulated annealing have been successful in solving such optimization problems (Goldberg, 2000; Aarts and Korst, 1997).
Programming The Unprogrammable (PTU): This involves the automatic creation of computer programs for unconventional computing devices such as cellular automata, multi-agent systems, parallel systems, field-programmable gate arrays, field-programmable analog arrays, ant colonies, swarm intelligence, distributed systems and the like (Koza et al., 2003).
Fractal ring tone generator: Using Mandelbrot sets, the authors have demonstrated a successful application for mobile ring tone generation (Suthikshn, 2004): using the simple Mandelbrot equation, complex ring tone music is generated in real time on the mobile handset.
SUMMARY AND CONCLUSIONS
In this article, we have presented a review of lateral-computing techniques. Lateral-computing is based on the lateral-thinking approach and applies unconventional techniques to solve computing problems. While most problems can be solved with conventional techniques, there are problems that require lateral-computing, which offers computational efficiency, low cost of implementation and better solutions than conventional computing for several such problems. Lateral-computing successfully tackles a class of problems by exploiting tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness and low solution cost. Lateral-computing techniques that use human-like information processing models have been classified as "soft computing" in the literature.
Lateral-computing proves to be very valuable when solving numerous computing problems whose mathematical models are unavailable. It provides a way of developing innovative solutions, resulting in smart systems with very high machine IQ (VHMIQ). This article has traced the transition from lateral thinking to lateral-computing; several lateral-computing techniques have then been described, followed by their applications. Lateral-computing is for building a new generation of artificial intelligence based on unconventional processing.
REFERENCES
E. DE BONO (1990); Lateral Thinking for Management: A Handbook, Penguin Books.
E. DE BONO (1991): Water Logic, Penguin Books.
E. DE BONO ( 2003) : Website: http://www.edwarddebono.com/
PROCEEDINGS OF IEEE (2001): Special Issue on Industrial Innovations Using Soft Computing, September.
T.ROSS (1997): Fuzzy Logic With Engineering Applications, MCGraw-Hill Inc Publishers.
T. MASTERS (1995); Neural, Novel and Hybrid Algorithm for Time Series Prediction, John Wiley and Sons Publishers.
D.E. GOLDBERG (2000); Genetic Algorithms in search, optimization and Machine Learning, Addison Wesley Publishers.
B. KOSKO (1997); Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine Intelligence, Prentice Hall Publishers.
B. KOSKO (1994); Fuzzy Thinking, Flamingo Publishers.
E. AARTS and J. KORST (1997); Simulated Annealing and Boltzmann Machines, John Wiley and Sons Publishers.
SUTHIKSHN KUMAR (2003); Smart Volume Tuner for Cellular Phones, IEEE Wireless Communications Magazine, June 2004, Vol 11, No.4, pp 44-49.
M. DORIGO, G DI CARO, L.M.GAMBERELLA (1999); Ant Algorithms for Discrete Optimization, Artificial Life, MIT Press.
N. PISANTI (1997); A Survey of DNA Computing, Technical Report TR-97-07, University of Pisa, Italy.
S. BRAUNSTEIN (1999); Quantum Computing, Wiley Publishers.
K. HWANG (1993); Advanced Computer Architecture: Parallelism, Scalability, Programmability, McGraw-Hill Book Co., New York, April.
K.V PALEM (2003); Energy Aware Computing through Probabilistic Switching: A study of limits, Technical Report GIT-CC-03-16 May 2003.
IAN FOSTER (1999); Computational Grids, Chapter 2 of The Grid: Blueprint for a New Computing Infrastructure, Technical Report.
M. SIMA, S. VASSILIADIS, S. COTOFONA, J. T. J. VAN EIJNDOVEN, and K. A. VISSERS (2000); A taxonomy of custom computing machines, in Proceedings of the Progress workshop, October.
J. GLEICK (1998); Chaos: Making a New Science, Vintage Publishers.
B. MANDELBROT (1977); The Fractal Geometry of Nature, Freeman Publishers, New York.
D.R. HOFSTADTER (1999); Godel, Escher, Bach: An Eternal Golden Braid, Harper Collins Publishers.
L. FORTNOW (2003); Introduction of Quantum Computing from the computer science perspective and reviewing activities, Special issue on Quantum Information Technology, NEC Res and Dev Journal, Vol. 44, No. 3, Jul 2003, pp. 268-272.
M.A. KARIM and A.A.S. AWWAL (1992); Optical Computing: An Introduction, Wiley Publishers.
R.A. ALIEV and R.R. ALIEV (2001); Soft Computing and Its Applications, World Scientific Publishers.
JYH-SHING ROGER JANG, CHUEN-TSAI SUN & EIJI MIZUTANI (1997); Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence, Prentice Hall Publishers.
JOHN R. KOZA, MARTIN A. KEANE, MATTHEW J. STREETER, WILLIAM MYDLOWEC, JESSEN YU, and GUIDO LANZA (2003); Genetic Programming IV: Routine Human-Competitive Machine Intelligence, Kluwer Academic.
JAMES ALLEN (1995); Natural Language Understanding, 2nd Edition, Pearson Education Publishers.
M.A. BODEN (1990); The Philosophy of Artificial Intelligence, Oxford University Press.
S. RUSSELL & P. NORVIG (2003); Artificial Intelligence: A Modern Approach, Prentice Hall Publishers.
R. HERKEN (1995); Universal Turing Machine, Springer-Verlag 2nd Edition.
HARRY R. LEWIS, CHRISTOS H. PAPADIMITROU (1997); Elements of Theory of Computation, 2nd edition, Prentice Hall Publishers.
M.GAREY and D. JOHNSON (1979); Computers and Intractability: A theory of NP Completeness, W.H. Freeman and Company Publishers.
M. SIPSER (2001); Introduction to the Theory of Computation, Thomson/Brooks/Cole Publishers.
K. COMPTON and S. HAUCK (2002); Reconfigurable Computing: A Survey of Systems and Software, ACM Computing Surveys, Vol. 34, No. 2, June 2002, pp. 171-210.
D.W. PATTERSON (1990); Introduction to Artificial Intelligence and Expert Systems, Prentice Hall Inc. Publishers.
E. CHARNIAK and D. MCDERMOTT (1999); Introduction to Artificial Intelligence, Addison Wesley.
F.H.HSU (2002); Behind Deep Blue: Building the Computer That Defeated the World Chess Champion, Princeton Univ Press.
E.BONABEAU, M. DORIGO, and G. THERAULUZ (1999); Swarm Intelligence: From Natural to Artificial Systems, Oxford University Press.
J.M. BRADSHAW (1997); Software Agents, AAAI Press/The MIT Press.
R. MURCH (2004); Autonomic Computing, Pearson Publishers.
S.R. HAMEROFF (1997); Ultimate Computing, Elsevier Science Publishers.
R.L. EPSTEIN and W.A. CARNIELLI (1989); Computability, Computable Functions, Logic and The Foundations of Mathematics, Wadsworth & Brooks/Cole Advanced Books and Software.
IBM (2004); http://www.research.ibm.com/autonomic/, http://www.ibm.com/autonomic/index.shtml
SUTHIKSHN KUMAR (1996); Reconfigurable Neurocomputers: Rapid Prototyping and Design Synthesis of Artificial Neural Networks for Field Programmable Gate Arrays, PhD Thesis, University of Melbourne, Australia.
T. JOACHIMS (2002); Learning to Classify Text using Support Vector Machines, Kluwer Academic Publishers.
T. MITCHELL (1997); Machine Learning, McGraw Hill Publishers.
SUTHIKSHN KUMAR (2004); Personalized Mobile Ring Tone Generator using Mandelbrot Music, AES 116th Convention, Berlin, Germany, 8-11 May.
R. MOTWANI and P. RAGHAVAN (1995); Randomized Algorithms, Cambridge International Series in Parallel Computation, Cambridge University Press.
SUN MICROSYSTEMS (2003); Introduction to Throughput Computing, Technical Report.
Conferences:
- First World Congress on Lateral Computing, WCLC 2004, IISc, Bangalore India, Dec 2004
- Second World Congress on Lateral Computing, WCLC 2005, PESIT, Bangalore, India