Saturday, April 24, 2010

Neural Networks Theory



Contents

Introduction
I.1 Neural Computers
I.2 Position of Neural Computers in the Set of Large-Powered Computing Facilities
I.3 The Concept of Computer Universalism
I.4 Neural Computer Modularity
I.5 The Class of Problems Adequate to Neural Computers
I.6 Methods of Coefficient Readjustment
I.7 Neural Computer Classification
I.8 Some Remarks Concerning the Neural Computer Elemental Base
I.9 Neural Mathematics – Methods and Algorithms of Problem Solving Using Neurocomputers
I.10 About Neural Networks
I.10.1 Neural Network Structures
I.10.2 Investigation of Neural Network Input Signal Characteristics
I.10.3 About the Selection of Criteria for Primary Neural Network Optimization
I.10.4 Analysis of Open-Loop Neural Networks
I.10.5 Algorithms for a Multivariable Functional Extremum Search and Design of Adaptation Algorithms in Neural Networks
I.10.6 Investigation of Neural Network Adaptation Algorithms
I.10.7 Multilayer Neural Networks with Flexible Structure
I.10.8 Informative Feature Selection in Multilayer Neural Networks
I.10.9 Investigation of Neural Network Reliability
I.10.10 Neural Network Diagnostics
I.11 Conclusions
Literature
Appendix
A.1 Theory of Multilayer Neural Networks
A.2 Neural Computer Implementation
A.3 Neural Computer Elemental Base

Part I · The Structure of Neural Networks

1 Transfer from the Logical Basis of Boolean Elements “And, Or, Not” to the Threshold Logical Basis
1.1 Linear Threshold Element (Neuron)
1.2 Multi-Threshold Logics
1.3 Continuous Logic
1.4 Particular Forms of Activation Function
Literature

2 Qualitative Characteristics of Neural Network Architectures
2.1 Particular Types of Neural Network Architectures
2.2 Multilayer Neural Networks with Sequential Connections
2.3 Structural and Symbolic Description of Multilayer Neural Networks
Literature

3 Optimization of Cross Connection Multilayer Neural Network Structure
3.1 About the Problem Complexity Criterion
3.2 One-Dimensional Variant of the Neural Network with Cross Connections
3.3 Calculation of Upper and Lower Estimation of the Number of Regions
3.4 Particular Optimization Problem
3.5 Structural Optimization by Some Main Topological Characteristics
3.6 Optimization of a Multilayer Neural Network Structure with Kp Solutions
Literature

4 Continual Neural Networks
4.1 Neurons with Continuum Input Features
4.2 Continuum of Neurons in the Layer
4.3 Continuum Neurons in the Layer and Discrete Feature Set
4.4 Classification of Continuum Neuron Layer Models
4.4.1 Discrete Set of Neurons
4.4.2 One-Dimensional and Two-Dimensional m2 Feature Space
4.4.3 Continuum of Features – One-Dimensional m1 for Several Channels
4.4.4 Feature Continuum – Two-Dimensional m1
4.4.5 Neuron Layer with a Continuum of Output Values
Literature

Part II · Optimal Models of Neural Networks

5 Investigation of Neural Network Input Signal Characteristics
5.1 Problem Statement
5.2 Joint Probability Distribution of the Input Signal for Two Pattern Classes
5.3 Joint Distribution Law for the Input Signal Probabilities in the Case of K Classes of Patterns
Literature

6 Design of Neural Network Optimal Models
6.1 General Structure of the Optimal Model
6.2 Analytical Representation of Divisional Surfaces in Typical Neural Networks
6.3 Optimal Neural Network Model for Multidimensional Signals e(n) and y(n)
6.4 A Priori Information about the Input Signal in the Self-Learning Mode
6.5 About Neural Network Primary Optimization Criteria in the Self-Learning Mode
6.6 Optimal Neural Network Models in the Self-Learning Mode and Arbitrary Teacher Qualification
Literature

7 Analysis of the Open-Loop Neural Networks
7.1 Distribution Laws of Analogous and Discrete Neural Network Errors
7.1.1 Neuron with Two Solutions
7.1.2 Neuron with a Solution Continuum
7.1.3 Analysis of a Neuron with Kp Solutions
7.1.4 Analysis of a Pattern Recognition System with a Nonlinear Divisional Surface
7.2 Selection of the Secondary Optimization Functional
7.3 About Selection of the Secondary Optimization Functional in the “Adalin” System
7.4 Development of the Secondary Optimization Functionals Corresponding to the Given Primary Optimization Criterion
7.4.1 The Average Risk Function Minimum Criterion
7.4.2 Minimum Criterion for R under the Condition p1l1 = p2r2
7.4.3 The Minimum Criterion for R under the Condition p1r1 = a = const
7.5 Neural Network Continuum Models
7.5.1 Neural Network with a Solution Continuum – Two Pattern Classes
7.5.2 Neural Network with a Solution Continuum – Continuum of Pattern Classes
7.5.3 Neural Network with Kp Solutions – K Pattern Classes
7.5.4 Neural Network with N* Output Channels – K0 Gradations in Each Class
7.5.5 Neural Network with N* Output Channels – Neural Network Solution Continuum
7.6 Neural Network in the Self-Learning Mode and Arbitrary Teacher Qualification
Literature

8 Development of Multivariable Function Extremum Search Algorithms
8.1 Procedure of the Secondary Optimization Functional Extremum Search in Multilayer Neural Networks
8.2 Analysis of the Iteration Method for the Multivariable Function Extremum Search
8.3 About the Stochastic Approximation Method
8.4 Iteration Methods for Multivariable Function Extremum Search in the Case of Equality-Type Constraints upon Variables
8.4.1 Search Algorithm
8.4.2 Analysis of the Matrix of the Second Derivatives of the Lagrange Function
8.4.3 Operation Speed Optimization for the Extremum Search Iteration Procedure in the Case of Equality-Type Constraints
8.4.4 Optimal Operation Speed under Constraints (8.6)
8.4.5 The Case of Constraints of Equality Type That Can Be Solved
8.4.6 Iteration Process Stability under Equality-Type Constraints
8.4.7 Convergence of the Iteration Search Method under the Equality-Type Constraints
8.5 Iteration Extremum Search Methods for Multivariable Functions under Inequality-Type Constraints
8.5.1 Conditions of Optimality
8.5.2 Algorithm of Extremum Search in the Case of Inequality-Type Constraints
8.6 Algorithm of Random Search of Local and Global Extrema for Multivariable Functions
8.7 Development of the Neural Network Adaptation Algorithms with the Use of Estimations of the Second Order Derivatives of the Secondary Optimization Functional
8.7.1 Development of Search Algorithms
8.7.2 One-Dimensional Case
Literature

Part III · Adaptive Neural Networks

9 Neural Network Adjustment Algorithms
9.1 Problem Statement
9.2 Neuron with Two-Solution Continuums
9.3 Two-Layer Neural Networks
9.4 Multilayer Neural Networks with Solution Continuum Neurons
9.5 Design of Neural Networks with Closed Cycle Adjustment under Constraints upon Variables
9.6 Implementation of Primary Optimization Criteria for Neurons with Two Solutions
9.7 Implementation of Minimum Average Risk Function Criterion for Neurons with Continuum Solutions and Kp Solutions
9.8 Implementation of the Minimum Average Risk Function Criterion for Neural Networks with N* Output Channels (Neuron Layer)
9.9 Implementation of the Minimum Average Risk Function Criterion for Multilayer Neural Networks
9.10 Development of Closed-Loop Neural Networks of Non-Stationary Patterns
9.11 Development of Closed-Cycle Adjustable Neural Networks with Cross and Backward Connections
9.12 Development of Closed-Loop Neural Networks in the Learning Modes with Arbitrary Teacher Qualification
9.13 Expressions for the Estimations of the Second Order Derivatives of the Secondary Optimization Functional
Literature

10 Adjustment of Continuum Neural Networks
10.1 Adjustment of a Neuron with a Feature Continuum
10.2 Adjustment of the Continuum Neuron Layer
10.3 Selection of the Parameter Matrix for the Learning Procedure of the Continuum Neuron Layer on the Basis of the Random Sample Data
10.4 Selection of the Parameter Matrix K*(i,j) for the Learning Procedure of the Neuron with a Feature Continuum on the Basis of the Random Sample Data
10.5 Characteristic Properties of the Two-Layer Continuum Neural Network Adjustment Algorithm
10.6 Three Variants of Implementation of the Continuum Neuron Layer Weighting Functions and Corresponding Learning Procedures
10.7 Learning Algorithm with a2g Secondary Optimization Functional (the Five-Feature Space) for the Two-Layer Continuum Neural Network
10.7.1 Learning Algorithm for the Second Layer (Feature Continuum Neuron)
10.7.2 Learning Algorithm for the First Layer (Continuum Neuron Layer)
10.8 Continuum Neuron Layer with Piecewise Constant Weighting Functions
10.8.1 Open-Loop Layer Structure
10.8.2 Recurrent Adjustment Procedure for the Piecewise Constant Weighting Functions
10.8.3 About Matrix K*(i) Estimation
10.9 Continuum Neuron Layer with Piecewise Linear Weighting Functions
10.9.1 Open-Loop Structure of the Neuron Layer
10.9.2 Recurrent Adjustment Procedure for the Piecewise Linear Weighting Functions
10.10 Continuum Neural Network Layer with Piecewise Constant Weighting Functions (the Case of Fixed “Footsteps”)
10.10.1 Open-Loop Layer Structure
10.10.2 Recurrent Adjustment Procedure for Piecewise Constant Weighting Functions with Variable Interval Lengths ts
Literature

11 Selection of Initial Conditions During Neural Network Adjustment – Typical Neural Network Input Signals
11.1 About Selection Methods for Initial Conditions
11.2 Algorithm of Deterministic Selection of the Initial Conditions in the Adjustment Algorithms for Multilayer Neural Networks
11.3 Selection of Initial Conditions in Multilayer Neural Networks
11.4 Initial Condition Formation for Neural Network Coefficient Setting in Different Problems of Optimization
11.4.1 Linear Equality Systems
11.4.2 Linear Inequality Systems
11.4.3 Approximation and Extrapolation of Functions
11.4.4 Pattern Recognition
11.4.5 Clusterization
11.4.6 Traveling Salesman Problem
11.4.7 Dynamic System Modelling
11.4.8 Conclusion
11.5 Typical Input Signal of Multilayer Neural Networks
Literature

12 Analysis of Closed-Loop Multilayer Neural Networks
12.1 Problem Statement for the Synthesis of the Multilayer Neural Networks Adjusted in the Closed Cycle
12.2 Investigation of the Neuron Under the Multi-Modal Distribution of the Input Signal
12.2.1 One-Dimensional Case – Search Adjustment Algorithm
12.2.2 Multidimensional Case – Analytical Adjustment Algorithm
12.3 Investigation of Dynamics for the Neural Networks of Particular Form for the Non-Stationary Pattern Recognition
12.4 Dynamics of the Three-Layer Neural Network in the Learning Mode
12.5 Investigation of the Particular Neural Network with Backward Connections
12.6 Dynamics of One-Layer Neural Networks in the Learning Mode
12.6.1 Neural Network with the Search of the Distribution Mode Centers f(x)
12.6.2 Neural Network with N* Output Channels
12.6.3 Neuron with Kp Solutions
12.7 Two-Layer Neural Network in the Self-Learning Mode
12.8 About Some Engineering Methods for the Selection of Matrix Parameters in the Multilayer Neural Network Closed Cycle Adjustment Algorithms
12.9 Design of the Multilayer Neural Network for the Matrix Inversion Problem
12.10 Design of the Multilayer Neural Network for the Number Transformation from the Binary System into the Decimal One
12.11 Investigation of the Multilayer Neural Network under the Arbitrary Teacher Qualification
12.12 Analytical Methods of Investigations of the Neural Network Closed Cycle Adjustment
Literature

13 Synthesis of Multilayer Neural Networks with Flexible Structure
13.1 Sequential Learning Algorithm for the First Neuron Layer of the Multilayer Neural Network
13.2 Learning Algorithm for the First Neuron Layer of the Multilayer Neural Network Using the Method of Random Search of Local and Global Function Extrema
13.3 Analysis of Algorithm Convergence under the Hyperplane Number Increase
13.4 Learning Algorithms for the Second Layer Neurons of the Two-Layer Neural Network
13.4.1 Condition of the Logical Function e(y) Realizability Using One Neuron
13.4.2 Synthesis of a Neuron by the Functional Minimization Method
13.4.3 Neuron Synthesis by the Threshold Function Tables
13.5 Learning Algorithm for Neurons of the Second and Third Layers in the Three-Layer Neural Network
13.6 General Methods of the Multilayer Neural Network Successive Synthesis
13.7 Learning Method for the First-Layer Neurons of a Multilayer Neural Network with a Feature Continuum
13.8 Application of the Adjustment Algorithm of the Multilayer Neural Networks with Flexible Structure for the Problem of Initial Condition Selection
13.9 About the Self-Learning Algorithm for Multilayer Neural Networks with Flexible Structure
Literature

14 Informative Feature Selection in Multilayer Neural Networks
14.1 Statement of the Informative Feature Selection Problem in the Learning Mode
14.2 About Structural Methods for the Informative Feature Selection in the Multilayer Neural Networks with Fixed Structure
14.3 Selection of the Initial Space Informative Features Using Multilayer Neural Networks with Sequential Algorithms of the First-Layer Neuron Adjustment
14.4 Neuron Number Minimization
14.5 About the Informative Feature Selection for Multilayer Neural Networks in the Self-Learning Mode
Literature

Part IV · Neural Network Reliability and Diagnostics

15 Neural Network Reliability
15.1 Methods for the Neural Network Functional Reliability Investigation
15.2 Investigation of Functional Reliability of Restoring Organs Implemented in the Form of Multilayer Neural Networks
15.3 Investigation of Multilayer Neural Network’s Functional Reliability
15.4 Investigation of the Neural Network’s Parametrical Reliability
15.5 Investigation of the Multilayer Neural Network’s Functional Reliability in the Case of Catastrophic Failures
Literature

16 Neural Network Diagnostics
16.1 Neural Network State Graph – The Main Notions and Definitions
16.2 Algorithm of Failure Localization in the Neural Networks
16.3 Algorithm of the Minimum Test Design for the Failures of the Logical Constant Type at the Neuron Outputs
16.4 Method of the Neural Network Adaptive Failure Diagnostics
Literature

Part V · Summary

17 Methods of Problem Solving in the Neural Network Logical Basis
17.1 Neuromathematics – A New Perspective Part of Computational Mathematics
17.2 Neural Network Theory – A Logical Basis for the Development of the Neural Network Problem Solution Algorithms
17.3 Selection of the Problems Adequate to the Neural Network Logical Basis
17.4 The General Structure of the Program Package for Problem Solution in the Neural Network Logical Basis
17.5 Multilayer Neural Networks with Flexible Structure
17.6 Neural Network with Fixed Structure
17.6.1 Generation of the Input Signal of the Neural Network
17.6.2 The Multilayer Neural Network Output Signal Generation
17.6.3 Formation of the Primary Optimization Criteria
17.6.4 Selection of the Open Neural Network Structure
17.6.5 Remarks about the Selection of the Open Neural Network Structure that is Adequate to the Class of Solution Tasks
17.6.6 Remarks about the Activation Function Selection
17.6.7 Selection of the Multilayer Neural Network Structure According to its Hardware Implementation Technology
17.6.8 Generation of the Secondary Optimization Functional in the Multilayer Neural Networks
17.6.9 Generation of the Algorithm of the Search Procedure for the Secondary Optimization Functional Extremum
17.6.10 Formation of the Adaptation Algorithms in the Multilayer Neural Networks
17.7 Verification of the Adjusted Multilayer Neural Network
17.8 Elaboration of the Plan of Experiments
17.9 About the Importance of the Unification of Designations in the Process of Synthesis of the Neural Network Adjustment Algorithms
17.10 About Myths in Neural Network Theory
17.11 Conclusion
References

Conclusion
Literature
Author’s Publications on Neural Network Theory
Index
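
For readers who want a quick feel for where the book starts, here is a minimal sketch (not taken from the text) of the linear threshold element described in Section 1.1: a neuron that outputs 1 when the weighted sum of its inputs plus a bias is positive and 0 otherwise. The weights and bias below are illustrative values chosen by hand to realize the logical AND function in the threshold basis that Chapter 1 discusses.

import numpy as np

def threshold_neuron(x, w, b):
    # Linear threshold element: output 1 if the weighted sum plus bias is positive, else 0.
    return 1 if np.dot(w, x) + b > 0 else 0

# Illustrative only: hand-picked weights and bias that realize logical AND on two binary inputs.
w = np.array([1.0, 1.0])
b = -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", threshold_neuron(np.array(x), w, b))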

To download this book, click here.
