Introduction to artificial neural systems (Record no. 3349)
000 -LEADER | |
---|---|
fixed length control field | 07393nam a2200169 4500 |
020 ## - INTERNATIONAL STANDARD BOOK NUMBER | |
International Standard Book Number | 8172246501 |
040 ## - CATALOGING SOURCE | |
Transcribing agency | CUS |
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER | |
Classification number | 006.32 |
Item number | ZUR/I |
100 ## - MAIN ENTRY--PERSONAL NAME | |
Personal name | Zurada, Jacek M. |
245 ## - TITLE STATEMENT | |
Title | Introduction to artificial neural systems |
Statement of responsibility, etc. | Jacek M. Zurada |
260 ## - PUBLICATION, DISTRIBUTION, ETC. (IMPRINT) | |
Place of publication, distribution, etc. | Ahmedabad: |
Name of publisher, distributor, etc. | Jaico Publishing House, |
Date of publication, distribution, etc. | 2006. |
300 ## - PHYSICAL DESCRIPTION | |
Extent | xxiv, 683, A1-I19 p. |
Other physical details | ill. |
505 ## - FORMATTED CONTENTS NOTE | |
Formatted contents note | 1 Artificial Neural Systems: Preliminaries<br/>1.1 Neural Computation: Some Examples and Applications<br/>Classifiers, Approximators, and Autonomous Drivers 4<br/>Simple Memory and Restoration of Patterns 10<br/>Optimizing Networks 14<br/>Clustering and Feature Detecting Networks 16<br/>1.2 History of Artificial Neural Systems Development 17<br/>1.3 Future Outlook 21<br/>References 22<br/>2 Fundamental Concepts and Models of Artificial Neural Systems 25<br/>2.1 Biological Neurons and Their Artificial Models 26<br/>Biological Neuron 27<br/>McCulloch-Pitts Neuron Model 30<br/>Neuron Modeling for Artificial Neural Systems 31<br/>2.2 Models of Artificial Neural Networks 37<br/>Feedforward Network 37<br/>Feedback Network 42<br/>2.3 Neural Processing 53<br/>2.4 Learning and Adaptation 55<br/>Learning as Approximation or Equilibria Encoding 55<br/>Supervised and Unsupervised Learning 56<br/>2.5 Neural Network Learning Rules 59<br/>Hebbian Learning Rule 60<br/>Perceptron Learning Rule 64<br/>Delta Learning Rule 66<br/>Widrow-Hoff Learning Rule 69<br/>Correlation Learning Rule 69<br/>Winner-Take-All Learning Rule 70<br/>Outstar Learning Rule 71<br/>Summary of Learning Rules 72<br/>2.6 Overview of Neural Networks 74<br/>2.7 Concluding Remarks 76<br/>Problems 78<br/>References 89<br/>3 Single-Layer Perceptron Classifiers 93<br/>3.1 Classification Model, Features, and Decision Regions 94<br/>3.2 Discriminant Functions 99<br/>3.3 Linear Machine and Minimum Distance Classification 106<br/>3.4 Nonparametric Training Concept 114<br/>3.5 Training and Classification Using the Discrete Perceptron: Algorithm and Example 120<br/>3.6 Single-Layer Continuous Perceptron Networks for Linearly Separable Classifications 132<br/>3.7 Multicategory Single-Layer Perceptron Networks 142<br/>3.8 Concluding Remarks 152<br/>Problems 153<br/>References 161<br/>4 Multilayer Feedforward Networks 163<br/>4.1 Linearly Nonseparable Pattern Classification 165<br/>4.2 Delta Learning Rule for Multiperceptron Layer 175<br/>4.3 Generalized Delta Learning Rule 181<br/>4.4 Feedforward Recall and Error Back-Propagation Training 185<br/>Feedforward Recall 185<br/>Error Back-Propagation Training 186<br/>Example of Error Back-Propagation Training 190<br/>Training Errors 195<br/>Multilayer Feedforward Networks as Universal Approximators 196<br/>4.5 Learning Factors 206<br/>Initial Weights 208<br/>Cumulative Weight Adjustment versus Incremental Updating 208<br/>Steepness of the Activation Function 209<br/>Learning Constant 210<br/>Momentum Method 211<br/>Network Architectures Versus Data Representation 214<br/>Necessary Number of Hidden Neurons 216<br/>4.6 Classifying and Expert Layered Networks 220<br/>Character Recognition Application 221<br/>Expert Systems Applications 225<br/>Learning Time Sequences 229<br/>4.7 Functional Link Networks 230<br/>4.8 Concluding Remarks 234<br/>Problems 235<br/>References 248<br/>5 Single-Layer Feedback Networks 251<br/>5.1 Basic Concepts of Dynamical Systems 253<br/>5.2 Mathematical Foundations of Discrete-Time Hopfield Networks 254<br/>5.3 Mathematical Foundations of Gradient-Type Hopfield Networks 264<br/>5.4 Transient Response of Continuous-Time Networks 276<br/>5.5 Relaxation Modeling in Single-Layer Feedback Networks 283<br/>5.6 Example Solutions of Optimization Problems 287<br/>Summing Network with Digital Outputs 287<br/>Minimization of the Traveling Salesman Tour Length 294<br/>5.7 Concluding Remarks 299<br/>Problems 301<br/>References 310<br/>6 Associative Memories 313<br/>6.1 Basic Concepts 314<br/>6.2 Linear Associator 320<br/>6.3 Basic Concepts of Recurrent Autoassociative Memory 325<br/>Retrieval Algorithm 327<br/>Storage Algorithm 328<br/>Performance Considerations 336<br/>6.4 Performance Analysis of Recurrent Autoassociative Memory 339<br/>Energy Function Reduction 342<br/>Capacity of Autoassociative Recurrent Memory 343<br/>Memory Convergence versus Corruption 345<br/>Fixed Point Concept 349<br/>Modified Memory Convergent Toward Fixed Points 351<br/>Advantages and Limitations 354<br/>6.5 Bidirectional Associative Memory 354<br/>Memory Architecture 355<br/>Association Encoding and Decoding 357<br/>Stability Considerations 359<br/>Memory Example and Performance Evaluation 360<br/>Improved Coding of Memories 363<br/>Multidirectional Associative Memory 368<br/>6.6 Associative Memory of Spatio-temporal Patterns 370<br/>6.7 Concluding Remarks 375<br/>Problems 377<br/>References 386<br/>7 Matching and Self-Organizing Networks 389<br/>7.1 Hamming Net and MAXNET 391<br/>7.2 Unsupervised Learning of Clusters 399<br/>Clustering and Similarity Measures 399<br/>Winner-Take-All Learning 401<br/>Recall Mode 406<br/>Initialization of Weights 406<br/>Separability Limitations 409<br/>7.3 Counterpropagation Network 410<br/>7.4 Feature Mapping 414<br/>7.5 Self-Organizing Feature Maps 423<br/>7.6 Cluster Discovery Network (ART1) 432<br/>7.7 Concluding Remarks 444<br/>Problems 445<br/>References 452<br/>8 Applications of Neural Algorithms and Systems 455<br/>8.1 Linear Programming Modeling Network 456<br/>8.2 Character Recognition Networks 464<br/>Multilayer Feedforward Network for Printed Character Classification 464<br/>Handwritten Digit Recognition: Problem Statement 476<br/>Recognition Based on Handwritten Character Skeletonization 478<br/>Recognition of Handwritten Characters Based on Error Back-propagation Training 482<br/>8.3 Neural Networks Control Applications 485<br/>Overview of Control Systems Concepts 485<br/>Process Identification 489<br/>Basic Nondynamic Learning Control Architectures 494<br/>Inverted Pendulum Neurocontroller 499<br/>Cerebellar Model Articulation Controller 504<br/>Concluding Remarks 511<br/>8.4 Networks for Robot Kinematics 513<br/>Overview of Robot Kinematics Problems 514<br/>Solution of the Forward and Inverse Kinematics Problems 516<br/>Comparison of Architectures for the Forward Kinematics Problem 519<br/>Target Position Learning 523<br/>8.5 Connectionist Expert Systems for Medical Diagnosis 527<br/>Expert System for Skin Diseases Diagnosis 528<br/>Expert System for Low Back Pain Diagnosis 532<br/>Expert System for Coronary Occlusion Diagnosis 537<br/>Concluding Remarks 539<br/>8.6 Self-Organizing Semantic Maps 539<br/>8.7 Concluding Remarks 546<br/>Problems 548<br/>References 559<br/>9 Neural Networks Implementation<br/>9.1 Artificial Neural Systems: Overview of Actual Models 566<br/>Node Numbers and Complexity of Computing Systems 567<br/>Neurocomputing Hardware Requirements 569<br/>Digital and Analog Electronic Neurocomputing Circuits 575<br/>9.2 Integrated Circuit Synaptic Connections 585<br/>Voltage-controlled Weights 587<br/>Analog Storage of Adjustable Weights 592<br/>Digitally Programmable Weights 595<br/>Learning Weight Implementation 605<br/>9.3 Active Building Blocks of Neural Networks 608<br/>Current Mirrors 610<br/>Inverter-based Neuron 613<br/>Differential Voltage Amplifiers 617<br/>Scalar Product and Averaging Circuits with Transconductance Amplifiers 624<br/>Current Comparator 626<br/>Template Matching Network 628<br/>9.4 Analog Multipliers and Scalar Product Circuits 630<br/>Depletion MOSFET Circuit 631<br/>Enhancement Mode MOS Circuit 636<br/>Analog Multiplier with Weight Storage 638<br/>Floating-Gate Transistor Multipliers 640<br/>9.5 Associative Memory Implementations 644<br/>9.6 Electronic Neural Processors 652<br/>9.7 Concluding Remarks 663<br/>Problems 666<br/>References 679 |
650 ## - SUBJECT | |
Keyword | Computer algorithms |
650 ## - SUBJECT | |
Keyword | Artificial neural systems |
942 ## - ADDED ENTRY ELEMENTS (KOHA) | |
Koha item type | GN Books |
Withdrawn status | Lost status | Damaged status | Not for loan | Home library | Current library | Shelving location | Date acquired | Full call number | Accession number | Date last seen | Koha item type |
---|---|---|---|---|---|---|---|---|---|---|---|
 | | | | Central Library, Sikkim University | Central Library, Sikkim University | General Book Section | 22/06/2016 | 006.32 ZUR/I | P20706 | 22/06/2016 | General Books |