Neural Network Course 4 Summary: Graduate Courseware (.ppt)


Chapter Highlights / Questions for Review / Questions for Discussion

1. The generalized delta rule, the most common method for training BP networks, is an iterative gradient-descent method that minimizes the mean-square error. The technique uses a momentum term to accelerate the training rate (a minimal sketch follows this list).
2. In building an ANN, the builder must make many decisions:
   * size of the training and test data sets
   * normalization of the input and output data sets
   * learning algorithm
   * topology
   * transfer function to be used
   * learning rate and momentum coefficient
3. A learning curve provides a good way to visualize a network's performance for recall and generalization.
4. A hierarchical neural network has several hidden layers segmented into subnetworks, where the input vectors are divided into groups based on their effects on the output responses. There are two main types:
   * moving-window networks for time-dependent processes (see the windowing sketch after this list)
   * input-compression networks for working with large sets of input variables
5. An autoassociative network correlates an input pattern with itself; it is used for data compression and filtering, and for dimensionality reduction of an input vector.
6. Recurrent networks for time-dependent systems combine the feedback and feedforward connections of neural networks, providing a means to use the output responses of the network as additional input variables through recurrent connections.
7. The internal representation within the hidden layers of an RBF network has a more natural interpretation (see the RBF sketch after this list).
8. Neural networks are widely used in modeling, simulation, control, operational fault identification, feature categorization, and so on.
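To make point 1 concrete, here is a minimal NumPy sketch of one training iteration under the generalized delta rule with a momentum term. The single-hidden-layer sigmoid topology and the names (W1, W2, lr, alpha) are illustrative assumptions, not taken from the courseware.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, t, W1, W2, dW1_prev, dW2_prev, lr=0.1, alpha=0.9):
    """One gradient-descent iteration on mean-square error, with a
    momentum term (alpha * previous update) to accelerate training."""
    # Forward pass through one hidden layer
    h = sigmoid(W1 @ x)          # hidden activations
    y = sigmoid(W2 @ h)          # network outputs
    # Backward pass: error signals (deltas) for each layer
    delta_out = (y - t) * y * (1 - y)             # output-layer delta
    delta_hid = (W2.T @ delta_out) * h * (1 - h)  # hidden-layer delta
    # Weight updates: negative gradient step plus momentum
    dW2 = -lr * np.outer(delta_out, h) + alpha * dW2_prev
    dW1 = -lr * np.outer(delta_hid, x) + alpha * dW1_prev
    return W1 + dW1, W2 + dW2, dW1, dW2

# Example usage on random data (shapes only, for illustration)
rng = np.random.default_rng(0)
x, t = rng.random(4), rng.random(2)
W1, W2 = rng.standard_normal((3, 4)), rng.standard_normal((2, 3))
dW1, dW2 = np.zeros_like(W1), np.zeros_like(W2)
for _ in range(100):
    W1, W2, dW1, dW2 = train_step(x, t, W1, W2, dW1, dW2)
```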
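The moving-window networks mentioned in point 4 feed a sliding window of recent process values to the network as its input vector. Below is a small sketch of how such windowed training pairs might be built; the window length is an assumed hyperparameter, not a value from the courseware.

```python
import numpy as np

def moving_window_inputs(series, window):
    """Turn a univariate time series into (input, target) pairs: each input
    is a window of the most recent `window` values, the target is the next
    value in the series."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X), np.array(y)

# Example: a 3-step window over a short series
X, y = moving_window_inputs(np.arange(10, dtype=float), window=3)
# X[0] == [0., 1., 2.] and y[0] == 3.
```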

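Point 7 notes that the hidden layer of an RBF network has a more natural interpretation: each hidden unit responds locally to inputs near its own center. The following is a brief sketch of such a Gaussian hidden layer, assuming the centers and width have already been chosen (for example, by clustering the training data).

```python
import numpy as np

def rbf_hidden_layer(x, centers, width):
    """Gaussian RBF hidden activations: each unit fires most strongly when
    the input x lies near that unit's center, which gives the hidden layer
    its local, interpretable representation."""
    dists = np.linalg.norm(centers - x, axis=1)      # distance to each center
    return np.exp(-(dists ** 2) / (2 * width ** 2))  # basis-function outputs

# Example: three hidden units in a 2-D input space
centers = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
activations = rbf_hidden_layer(np.array([0.9, 1.1]), centers, width=0.5)
```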