Pattern recognition (Record no. 3612)

MARC details
000 -LEADER
fixed length control field 11658nam a22002057a 4500
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number 9789380931623
040 ## - CATALOGING SOURCE
Transcribing agency CUS
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 006.4
Item number THE/P
100 ## - MAIN ENTRY--PERSONAL NAME
Personal name Theodoridis, Sergios
245 ## - TITLE STATEMENT
Title Pattern recognition
Statement of responsibility, etc. Sergios Theodoridis and Konstantinos Koutroumbas
250 ## - EDITION STATEMENT
Edition statement 4th ed.
260 ## - PUBLICATION, DISTRIBUTION, ETC. (IMPRINT)
Place of publication, distribution, etc. Amsterdam:
Name of publisher, distributor, etc. Elsevier,
Date of publication, distribution, etc. 2009.
300 ## - PHYSICAL DESCRIPTION
Extent xvii, 961 p.
Other physical details ill.
505 ## - FORMATTED CONTENTS NOTE
Formatted contents note
2.5.1 Maximum Likelihood Parameter Estimation 34
2.5.2 Maximum a Posteriori Probability Estimation 38
2.5.3 Bayesian Inference 39
2.5.4 Maximum Entropy Estimation 43
2.5.5 Mixture Models 44
2.5.6 Nonparametric Estimation 49
2.5.7 The Naive-Bayes Classifier 59
2.6 The Nearest Neighbor Rule 61
2.7 Bayesian Networks 64
2.8 Problems 71
References 86
CHAPTER 3 Linear Classifiers 91
3.1 Introduction 91
3.2 Linear Discriminant Functions and Decision Hyperplanes 91
3.3 The Perceptron Algorithm 93
3.4 Least Squares Methods 103
3.4.1 Mean Square Error Estimation 103
3.4.2 Stochastic Approximation and the LMS Algorithm 105
3.4.3 Sum of Error Squares Estimation 108
3.5 Mean Square Estimation Revisited 110
3.5.1 Mean Square Error Regression 110
3.5.2 MSE Estimates Posterior Class Probabilities 112
3.5.3 The Bias-Variance Dilemma 114
3.6 Logistic Discrimination 117
3.7 Support Vector Machines 119
3.7.1 Separable Classes 119
3.7.2 Nonseparable Classes 124
3.7.3 The Multiclass Case 127
3.7.4 ν-SVM 133
3.7.5 Support Vector Machines: A Geometric Viewpoint 136
3.7.6 Reduced Convex Hulls 138
3.8 Problems 142
References 147
CHAPTER 4 Nonlinear Classifiers 151
4.1 Introduction 151
4.2 The XOR Problem 151
4.3 The Two-Layer Perceptron 153
4.3.1 Classification Capabilities of the Two-Layer Perceptron 156
4.4 Three-Layer Perceptrons 158
4.5 Algorithms Based on Exact Classification of the Training Set 160
4.6 The Backpropagation Algorithm 162
4.7 Variations on the Backpropagation Theme 169
4.8 The Cost Function Choice 172
4.9 Choice of the Network Size 176
4.10 A Simulation Example 181
4.11 Networks with Weight Sharing 183
4.12 Generalized Linear Classifiers 185
4.13 Capacity of the l-Dimensional Space in Linear Dichotomies
4.14 Polynomial Classifiers 189
4.15 Radial Basis Function Networks 190
4.16 Universal Approximators
4.17 Probabilistic Neural Networks 196
4.18 Support Vector Machines: The Nonlinear Case 198
4.19 Beyond the SVM Paradigm 203
4.19.1 Expansion in Kernel Functions and Model Sparsification 205
4.19.2 Robust Statistics Regression 211
4.20 Decision Trees 215
4.20.1 Set of Questions 218
4.20.2 Splitting Criterion 218
4.20.3 Stop-Splitting Rule 219
4.20.4 Class Assignment Rule 219
4.21 Combining Classifiers 222
4.21.1 Geometric Average Rule 223
4.21.2 Arithmetic Average Rule 224
4.21.3 Majority Voting Rule 225
4.21.4 A Bayesian Viewpoint 227
4.22 The Boosting Approach to Combine Classifiers 230
4.23 The Class Imbalance Problem 237
4.24 Discussion 239
4.25 Problems 240
References 249
CHAPTER 5 Feature Selection 261
5.1 Introduction 261
5.2 Preprocessing 262
5.2.1 Outlier Removal 262
5.2.2 Data Normalization 263
5.2.3 Missing Data 263
5.3 The Peaking Phenomenon 265
5.4 Feature Selection Based on Statistical Hypothesis Testing 268
5.4.1 Hypothesis Testing Basics 268
5.4.2 Application of the t-Test in Feature Selection 273
5.5 The Receiver Operating Characteristics (ROC) Curve 275
5.6 Class Separability Measures 276
5.6.1 Divergence 276
5.6.2 Chernoff Bound and Bhattacharyya Distance 278
5.6.3 Scatter Matrices 280
5.7 Feature Subset Selection 283
5.7.1 Scalar Feature Selection 283
5.7.2 Feature Vector Selection 284
5.8 Optimal Feature Generation 288
5.9 Neural Networks and Feature Generation/Selection 298
5.10 A Hint on Generalization Theory 299
5.11 The Bayesian Information Criterion 309
5.12 Problems 311
References 318
CHAPTER 6 Feature Generation I: Data Transformation and Dimensionality Reduction 323
6.1 Introduction 323
6.2 Basis Vectors and Images 324
6.3 The Karhunen-Loeve Transform 326
6.4 The Singular Value Decomposition 335
6.5 Independent Component Analysis 342
6.5.1 ICA Based on Second- and Fourth-Order Cumulants 344
6.5.2 ICA Based on Mutual Information 345
6.5.3 An ICA Simulation Example 348
6.6 Nonnegative Matrix Factorization 349
6.7 Nonlinear Dimensionality Reduction 350
6.7.1 Kernel PCA 351
6.7.2 Graph-Based Methods 353
6.8 The Discrete Fourier Transform (DFT) 363
6.8.1 One-Dimensional DFT 364
6.8.2 Two-Dimensional DFT 366
6.9 The Discrete Cosine and Sine Transforms 366
6.10 The Hadamard Transform 368
6.11 The Haar Transform 369
6.12 The Haar Expansion Revisited 371
6.13 Discrete Time Wavelet Transform (DTWT) 375
6.14 The Multiresolution Interpretation 384
6.15 Wavelet Packets 387
6.16 A Look at Two-Dimensional Generalizations 388
6.17 Applications
6.18 Problems
References
CHAPTER 7 Feature Generation II
7.1 Introduction
7.2 Regional Features 412
7.2.1 Features for Texture Characterization 412
7.2.2 Local Linear Transforms for Texture Feature Extraction
7.2.3 Moments
7.2.4 Parametric Models
7.3 Features for Shape and Size Characterization 435
7.3.1 Fourier Features 436
7.3.2 Chain Codes 439
7.3.3 Moment-Based Features 441
7.3.4 Geometric Features 442
7.4 A Glimpse at Fractals 444
7.4.1 Self-Similarity and Fractal Dimension 444
7.4.2 Fractional Brownian Motion 446
7.5 Typical Features for Speech and Audio Classification 451
7.5.1 Short-Time Processing of Signals 452
7.5.2 Cepstrum 455
7.5.3 The Mel-Cepstrum 457
7.5.4 Spectral Features 460
7.5.5 Time Domain Features 462
7.5.6 An Example 463
7.6 Problems 466
References 473
CHAPTER 8 Template Matching 481
8.1 Introduction 481
8.2 Measures Based on Optimal Path Searching Techniques 482
8.2.1 Bellman's Optimality Principle and Dynamic Programming 484
8.2.2 The Edit Distance 487
8.2.3 Dynamic Time Warping in Speech Recognition 491
8.3 Measures Based on Correlations 498
8.4 Deformable Template Models 504
8.5 Content-Based Information Retrieval: Relevance Feedback 508
8.6 Problems 513
References
CHAPTER 9 Context-Dependent Classification 521
9.1 Introduction
9.2 The Bayes Classifier 521
9.3 Markov Chain Models 522
9.4 The Viterbi Algorithm 523
9.5 Channel Equalization 527
9.6 Hidden Markov Models 532
9.7 HMM with State Duration Modeling 545
9.8 Training Markov Models via Neural Networks 552
9.9 A Discussion of Markov Random Fields 554
9.10 Problems 556
References 560
CHAPTER 10 Supervised Learning: The Epilogue 567
10.1 Introduction 567
10.2 Error-Counting Approach 568
10.3 Exploiting the Finite Size of the Data Set 569
10.4 A Case Study from Medical Imaging 573
10.5 Semi-Supervised Learning 577
10.5.1 Generative Models 579
10.5.2 Graph-Based Methods 582
10.5.3 Transductive Support Vector Machines 586
10.6 Problems 590
References 591
CHAPTER 11 Clustering: Basic Concepts 595
11.1 Introduction 595
11.1.1 Applications of Cluster Analysis 598
11.1.2 Types of Features 599
11.1.3 Definitions of Clustering 600
11.2 Proximity Measures 602
11.2.1 Definitions 602
11.2.2 Proximity Measures between Two Points 604
11.2.3 Proximity Functions between a Point and a Set 616
11.2.4 Proximity Functions between Two Sets 620
11.3 Problems 622
References 624
CHAPTER 12 Clustering Algorithms I: Sequential Algorithms 627
12.1 Introduction 627
12.1.1 Number of Possible Clusterings 627
12.2 Categories of Clustering Algorithms 629
12.3 Sequential Clustering Algorithms 633
12.3.1 Estimation of the Number of Clusters 635
12.4 A Modification of BSAS 637
12.5 A Two-Threshold Sequential Scheme 638
12.6 Refinement Stages 641
12.7 Neural Network Implementation 643
12.7.1 Description of the Architecture 643
12.7.2 Implementation of the BSAS Algorithm 644
12.8 Problems 646
References 650
CHAPTER 13 Clustering Algorithms II: Hierarchical Algorithms 653
13.1 Introduction 653
13.2 Agglomerative Algorithms 654
13.2.1 Definition of Some Useful Quantities 655
13.2.2 Agglomerative Algorithms Based on Matrix Theory 658
13.2.3 Monotonicity and Crossover 664
13.2.4 Implementational Issues 667
13.2.5 Agglomerative Algorithms Based on Graph Theory 667
13.2.6 Ties in the Proximity Matrix 676
13.3 The Cophenetic Matrix 679
13.4 Divisive Algorithms 680
13.5 Hierarchical Algorithms for Large Data Sets 682
13.6 Choice of the Best Number of Clusters 690
13.7 Problems 693
References 697
CHAPTER 14 Clustering Algorithms III: Schemes Based on Function Optimization 701
14.1 Introduction 701
14.2 Mixture Decomposition Schemes 703
14.2.1 Compact and Hyperellipsoidal Clusters 705
14.2.2 A Geometrical Interpretation 709
14.3 Fuzzy Clustering Algorithms 712
14.3.1 Point Representatives 716
14.3.2 Quadric Surfaces as Representatives 718
14.3.3 Hyperplane Representatives 728
14.3.4 Combining Quadric and Hyperplane Representatives 731
14.3.5 A Geometrical Interpretation 732
14.3.6 Convergence Aspects of the Fuzzy Clustering Algorithms 732
14.3.7 Alternating Cluster Estimation 733
14.4 Possibilistic Clustering 733
14.4.1 The Mode-Seeking Property 737
14.4.2 An Alternative Possibilistic Scheme 739
14.5 Hard Clustering Algorithms 739
14.5.1 The Isodata or k-Means or c-Means Algorithm 741
14.5.2 k-Medoids Algorithms 745
14.6 Vector Quantization 749
14.7 Problems 752
References 758
CHAPTER 15 Clustering Algorithms IV 765
15.1 Introduction 765
15.2 Clustering Algorithms Based on Graph Theory 765
15.2.1 Minimum Spanning Tree Algorithms 766
15.2.2 Algorithms Based on Regions of Influence 768
15.2.3 Algorithms Based on Directed Trees 770
15.2.4 Spectral Clustering 772
15.3 Competitive Learning Algorithms 780
15.3.1 Basic Competitive Learning Algorithm 782
15.3.2 Leaky Learning Algorithm 783
15.3.3 Conscientious Competitive Learning Algorithms 784
15.3.4 Competitive Learning-Like Algorithms Associated with Cost Functions 785
15.3.5 Self-Organizing Maps 786
15.3.6 Supervised Learning Vector Quantization 788
15.4 Binary Morphology Clustering Algorithms (BMCAs) 789
15.4.1 Discretization 790
15.4.2 Morphological Operations 791
15.4.3 Determination of the Clusters in a Discrete Binary Set 794
15.4.4 Assignment of Feature Vectors to Clusters 795
15.4.5 The Algorithmic Scheme 796
15.5 Boundary Detection Algorithms 798
15.6 Valley-Seeking Clustering Algorithms 801
15.7 Clustering via Cost Optimization (Revisited) 803
15.7.1 Branch and Bound Clustering Algorithms 803
15.7.2 Simulated Annealing 807
15.7.3 Deterministic Annealing 808
15.7.4 Clustering Using Genetic Algorithms 810
15.8 Kernel Clustering Methods 811
15.9 Density-Based Algorithms for Large Data Sets 815
15.9.1 The DBSCAN Algorithm 815
15.9.2 The DBCLASD Algorithm 818
15.9.3 The DENCLUE Algorithm 819
15.10 Clustering Algorithms for High-Dimensional Data Sets 821
15.10.1 Dimensionality Reduction Clustering Approach 822
15.10.2 Subspace Clustering Approach 824
15.11 Other Clustering Algorithms 837
15.12 Combination of Clusterings 839
15.13 Problems 846
References 852
CHAPTER 16 Cluster Validity 863
16.1 Introduction 863
16.2 Hypothesis Testing Revisited 864
16.3 Hypothesis Testing in Cluster Validity 866
16.3.1 External Criteria 868
16.3.2 Internal Criteria 873
16.4 Relative Criteria 877
16.4.1 Hard Clustering 880
16.4.2 Fuzzy Clustering 887
16.5 Validity of Individual Clusters 893
16.5.1 External Criteria 894
16.5.2 Internal Criteria 894
16.6 Clustering Tendency 896
16.6.1 Tests for Spatial Randomness 900
16.7 Problems 905
References 909
650 ## - SUBJECT
Keyword Pattern recognition
650 ## - SUBJECT
Keyword Biometrics
650 ## - SUBJECT
Keyword Bioinformatics
650 ## - SUBJECT
Keyword Dynamic programming
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Koha item type GN Books
Holdings
Home library Central Library, Sikkim University
Current library Central Library, Sikkim University
Shelving location General Book Section
Date acquired 26/06/2016
Full call number 006.4 THE/P
Accession number P35938
Date last seen 26/06/2016
Koha item type General Books