What are artificial neural networks? What are their components and associated learning algorithms? What can they be used for? What are their properties and limitations? This book addresses these questions, providing a clear introduction to neural networks for newcomers to the field who want to use them as well as understand the underlying principles and algorithms.
The authors of this book on artificial neural networks, who have team-taught the material in a one-semester course for more than 10 years, describe most of the basic neural network models (with several detailed solved examples) and discuss their rationale and relative advantages. Their approach requires little mathematical or technical background. Written from an algorithmic perspective, this text on artificial neural networks stresses links to contiguous fields and can serve as a first course for students in computer science as well as disciplines such as engineering, medicine, economics and management, where the goal is to learn how to develop practical applications using neural network tools.
The introductory first chapter presents the basic concepts and tackles important, yet rarely addressed, questions related to the use of neural networks in practical situations. The material is structured around classes of problems to which networks can be applied. Topics include supervised learning (single-layer and multilayer networks), unsupervised learning, associative models, and optimization methods.
The most frequently used algorithms are introduced early on, right after perceptrons, so that these can form the basis for initiating course projects. In this updated second edition, algorithms developed in the late 1990s are also included. Algorithms are presented using block-structured pseudo-code, and exercises are provided throughout.
INTRODUCTION: History of Neural Networks Structure and Function of a Single Neuron: Biological neurons, Artificial neuron models. Neural Net Architectures: Fully connected networks, Layered networks, Acyclic networks, Feedforward networks, Modular neural networks. Neural Learning: Correlation learning, Competitive learning, Feedback-based weight adaptation. What can Neural Networks be Used for? Classification, Clustering, Vector quantization, Pattern association, Function approximation, Forecasting, Control applications, Optimization, Search. Evaluation of Networks: Quality of results, Generalizability, Computational resources, Regularization. Implementation, Conclusion, Exercises
SUPERVISED LEARNING: Single Layer Networks: Perceptrons, Linear Separability, Perceptron Training Algorithm: Termination criterion, Choice of learning rate, Non-numeric inputs. Guarantee of Success, Modifications: Pocket algorithm, Adalines, Multiclass discrimination. Support Vector Classification: Linearly non-separable classes. Conclusion, Exercises
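To give a flavor of the perceptron training algorithm listed in this chapter, here is a minimal sketch. It is my own illustration, not the book's pseudo-code: the learning rate, epoch cap, and the AND-function data set are arbitrary choices, and the termination criterion (stop when an epoch produces no misclassifications) mirrors the topic named in the outline.

```python
import numpy as np

# Perceptron training on the (linearly separable) AND function.
# Learning rate and epoch cap are illustrative values, not from the book.
def train_perceptron(X, y, lr=0.1, epochs=100):
    X = np.hstack([X, np.ones((len(X), 1))])  # append a constant bias input
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, target in zip(X, y):
            pred = 1 if xi @ w > 0 else 0
            if pred != target:
                w += lr * (target - pred) * xi  # perceptron update rule
                errors += 1
        if errors == 0:  # termination criterion: a full error-free epoch
            break
    return w

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])  # logical AND
w = train_perceptron(X, y)
preds = [1 if np.append(x, 1) @ w > 0 else 0 for x in X]
```

Because AND is linearly separable, the perceptron convergence guarantee discussed in the chapter applies and training terminates with all four inputs classified correctly.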
SUPERVISED LEARNING: Multilayer Networks I: Multi-level Discrimination, Preliminaries: Architecture, Objectives. Backpropagation Algorithm, Classification using Backpropagation, Setting the Parameter Values: Initialization of weights, Frequency of weight updates, Choice of learning rate, Momentum, Generalizability, Controlling weight magnitudes, Number of hidden layers and nodes, Number of samples. Theoretical Results: Cover's Theorem, Representations of functions, Approximations of functions. Accelerating the Learning Process: Quickprop algorithm, Conjugate gradient. Multiclass Problems: The Bayes classifier, One against the rest, All-pairs (pairwise) classification, Nearest neighbor classification. Applications: Weaning from mechanically assisted ventilation, Classification of electromyographic signals, Forecasting commodity prices, Controlling a gantry crane. Conclusion, Exercises
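As an illustration of the backpropagation algorithm this chapter covers, the sketch below trains a small one-hidden-layer network on XOR. The 2-4-1 architecture, sigmoid activations, squared-error loss, learning rate, and epoch count are all assumptions made for the example, not parameters taken from the book:

```python
import numpy as np

# Batch gradient descent with backpropagation on XOR (2-4-1 network).
# All sizes and rates here are illustrative choices.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
sig = lambda z: 1 / (1 + np.exp(-z))

losses = []
for _ in range(5000):
    h = sig(X @ W1 + b1)                  # forward pass: hidden layer
    out = sig(h @ W2 + b2)                # forward pass: output layer
    losses.append(float(np.mean((out - y) ** 2)))
    d_out = (out - y) * out * (1 - out)   # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)    # delta backpropagated to hidden layer
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(0)
```

The recorded loss sequence lets one check that gradient descent is actually reducing the training error, which is the behavior the chapter's discussion of learning rate and weight initialization is concerned with.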
SUPERVISED LEARNING: Multilayer Networks II: Adaptive Multilayer Networks: Network pruning algorithms, Network growing algorithms. Boosting, Prediction Networks: Recurrent networks, Feedforward networks for forecasting. Radial Basis Functions, Support Vector Machines: Support vector regression. Probabilistic Neural Network, Polynomial Networks, Conclusion, Exercises
UNSUPERVISED LEARNING: Winner-Take-All Networks: Hamming networks, Maxnet, Simple competitive learning. Learning Vector Quantizers, Counterpropagation Networks, Adaptive Resonance Theory, Topologically Organized Networks: Self-organizing maps, Convergence, Extensions. Distance-based Learning: Maximum entropy, Neural Gas. Principal Component Analysis Networks: Support vector approach for PCA. Conclusion, Exercises
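The "simple competitive learning" entry above can be sketched in a few lines: each input moves only its nearest (winning) prototype toward itself. The two synthetic clusters, the prototype count, and the learning rate are made-up illustration values, not examples from the book:

```python
import numpy as np

# Winner-take-all competitive learning on two synthetic 2-D clusters.
# Cluster centers, prototype count, and learning rate are arbitrary.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)),   # cluster near (0, 0)
                  rng.normal(3.0, 0.1, (50, 2))])  # cluster near (3, 3)
protos = rng.normal(1.5, 0.5, (2, 2))              # two random prototypes

def quant_error(P):
    # mean distance from each data point to its nearest prototype
    d = np.linalg.norm(data[:, None, :] - P[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

err_before = quant_error(protos)
for _ in range(20):
    for x in rng.permutation(data, axis=0):
        winner = np.argmin(np.linalg.norm(protos - x, axis=1))
        protos[winner] += 0.1 * (x - protos[winner])  # move winner toward x
err_after = quant_error(protos)
```

The quantization error before and after training gives a simple way to see that the prototypes have moved toward the cluster centers, the same role the codebook vectors play in the learning vector quantizers covered in this chapter.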
ASSOCIATIVE LEARNING: Non-iterative Procedures for Association, Hopfield Networks: Discrete Hopfield networks, Storage capacity of Hopfield networks, Continuous Hopfield networks. Optimization Using Hopfield Networks: Traveling salesperson problem, Solving simultaneous linear equations, Allocating documents to multiprocessors. Brain-State-in-a-Box Network, Boltzmann Machines: Simulated annealing, Boltzmann machine learning algorithm, Mean field annealing. Hetero-associators, Conclusion, Exercises
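A minimal discrete Hopfield network, of the kind this chapter covers, can be demonstrated by storing one bipolar pattern with the outer-product (Hebbian) rule and recalling it from a corrupted probe. The 8-bit pattern and the fixed number of update sweeps are assumptions for the example:

```python
import numpy as np

# Discrete Hopfield network: store one bipolar pattern, recall from noise.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
n = len(pattern)
W = np.outer(pattern, pattern).astype(float)  # Hebbian outer-product rule
np.fill_diagonal(W, 0)                        # no self-connections

state = pattern.copy()
state[0] = -state[0]                          # corrupt one bit as the probe
for _ in range(5):                            # asynchronous update sweeps
    for i in range(n):
        state[i] = 1 if W[i] @ state >= 0 else -1
recalled = state.tolist()
```

With a single stored pattern the network is well under the storage-capacity limit discussed in the chapter, so the dynamics settle back to the stored pattern.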
EVOLUTIONARY OPTIMIZATION: Optimization and Search: Iterated gradient descent, Random search. Evolutionary Computation: Initialization, Termination criterion, Reproduction, Operators, Replacement, Schema Theorem. Evolutionary Algorithms for Training Neural Networks, Learning Connection Weights: Representation, Mutation, Recombination, Evolution using past history. Learning Architecture, Evolving Multiple Modules, Evolving Neurons, Hybrid Evolutionary Approaches, Conclusion, Exercises
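The reproduction, crossover, mutation, and replacement steps named in this chapter's outline can be sketched with a toy evolutionary algorithm on the OneMax problem (maximize the number of 1-bits). The problem choice, population size, truncation selection, and mutation scheme are all my illustrative assumptions, not the book's examples:

```python
import random

# Toy evolutionary search on OneMax: maximize the number of 1-bits.
# String length, population size, and generation count are arbitrary.
random.seed(0)
L, POP, GENS = 20, 30, 60

def fitness(bits):
    return sum(bits)

pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]                 # truncation selection (elitist)
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, L)         # one-point crossover
        child = a[:cut] + b[cut:]
        i = random.randrange(L)              # single-bit point mutation
        child[i] ^= 1
        children.append(child)
    pop = parents + children                 # replacement with elitism
best = max(pop, key=fitness)
```

Keeping the parents in the next generation (elitism) guarantees the best fitness never decreases, one of the design choices the chapter's discussion of replacement strategies addresses.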
APPENDICES: A Little Mathematics: Calculus, Linear Algebra, Statistics, Optimization, Vapnik-Chervonenkis Dimension. Data: Iris Data, Classification of Myoelectric Signals, Gold Prices, Clustering Animal Features, 3-D Corners, Grid and Approximation, Eleven-City Traveling Salesperson Problem, Daily Stock Prices of Three Companies.
Kishan Mehrotra, Professor, Department of EECS, L.C. Smith College of Engineering and Computer Science, Syracuse University, Syracuse NY
Chilukuri K. Mohan, Professor and Department Chair, Electrical Engineering and Computer Science, Syracuse University, Syracuse NY
Sanjay Ranka, Professor, Department of Computer Information Science and Engineering, University of Florida
Undergraduate and postgraduate students of Computer Science and Electronics Engineering.
REVIEWS
"I found the book Elements of Artificial Neural Networks easy to access, pleasant to read, well planned, and with a good repertoire of problems. Chapter 3 on Supervised Learning: Multilayer Networks and Chapter 5 on Unsupervised Learning are particularly well written, taking the reader through the foundational ideas. Anybody interested in ANN should have this book." — Professor Pushpak Bhattacharya, Department of Computer Science and Engineering, IIT Bombay
"Elements of Artificial Neural Networks is appropriate as a text for a senior-level class for engineering and/or computer science students. It is also likely to be used by students in economics and management. The authors have done a very good job of describing many of the popular network structures, with several detailed solved examples. The lucid writing style makes the book accessible to a wide range of students and fills the need for a sound, engineering-oriented, senior-level text in this exciting area." — Joydeep Ghosh, Professor and Endowed Engineering Foundation Fellow