Optimal Brain Surgeon Algorithm
ECE 539 Final Project
Mark Slosarek
9014829809

Introduction:

For this project I implemented the Optimal Brain Surgeon (OBS) algorithm. OBS is a pruning algorithm that removes connections between layers, making the network more efficient. A pruned network has several advantages. First, it needs less storage space. Second, it spends less time in calculation. In addition, reducing a neural network's complexity improves the network's ability to generalize to future examples.

Work Performed:

The Optimal Brain Surgeon algorithm is based on the MLP feed-forward model similar to that described in Lecture 9, "MLP (I): Feed-forward Model," of Professor Hu's notes.

Figure 1: A three-layer feed-forward multilayer perceptron

In implementing the Optimal Brain Surgeon algorithm, I followed the six steps described in Neural Networks: A Comprehensive Foundation by Simon Haykin:

1. Train the given multilayer perceptron to minimum mean-square error.

2. Use the multilayer perceptron model with two hidden layers and one output neuron (a three-layer feed-forward multilayer perceptron) to compute the gradient vector $g = \partial F(w, x) / \partial w$, where $F(w, x)$ is the input-output mapping realized by the multilayer perceptron with an overall weight vector $w$, and $x$ is the input vector.

3. Use the recursion
   $H_m^{-1} = H_{m-1}^{-1} - \dfrac{H_{m-1}^{-1} g_m g_m^T H_{m-1}^{-1}}{P + g_m^T H_{m-1}^{-1} g_m}$, with $H_0^{-1} = \alpha^{-1} I$,
   to calculate the inverse Hessian $H^{-1}$.

4. Find the $i$ that corresponds to the smallest saliency
   $S_i = \dfrac{w_i^2}{2\,[H^{-1}]_{i,i}}$,
   where $[H^{-1}]_{i,i}$ is the $(i, i)$-th element of $H^{-1}$. If the saliency $S_i$ is much smaller than the mean-square error, then delete synaptic weight $w_i$ and proceed to step 5. Otherwise, go to step 6.

5. Update all the synaptic weights in the network by applying the adjustment
   $\Delta w = -\dfrac{w_i}{[H^{-1}]_{i,i}}\, H^{-1} e_i$,
   then go to step 2.

6. Stop the computation when no more weights can be deleted from the network without a large increase in the mean-square error.

Implementing this procedure required many nested loops, and these loops proved computationally expensive, so I implemented a weight-removal scheme that removed more than one weight per pass. Although this was more ef
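One pass of the six-step procedure above can be sketched in NumPy. This is a minimal illustration, not the project's actual code: the function name `obs_prune_step`, the per-example gradient matrix `grads`, and the constant `alpha` are hypothetical, and the sketch assumes the outer-product Hessian approximation $H = \alpha I + \frac{1}{P}\sum_k g_k g_k^T$ that underlies Haykin's recursion.

```python
import numpy as np

def obs_prune_step(w, grads, alpha=1e-4):
    """One Optimal Brain Surgeon pruning step (illustrative sketch).

    w     : (n,)  current weight vector of the trained network
    grads : (P, n) per-example gradients g_k = dF/dw of the network output
    alpha : small constant so that H_0^{-1} = (1/alpha) I is well defined
    Returns (w_new, i, saliency): updated weights, index of the deleted
    weight, and its saliency. w_new[i] is exactly zero.
    """
    P, n = grads.shape

    # Step 3: build the inverse Hessian by the rank-one recursion
    # H_m^{-1} = H_{m-1}^{-1}
    #            - (H_{m-1}^{-1} g_m g_m^T H_{m-1}^{-1}) / (P + g_m^T H_{m-1}^{-1} g_m)
    H_inv = np.eye(n) / alpha
    for g in grads:
        Hg = H_inv @ g
        H_inv = H_inv - np.outer(Hg, Hg) / (P + g @ Hg)

    # Step 4: saliency of each weight, S_i = w_i^2 / (2 [H^{-1}]_{ii})
    saliencies = w**2 / (2.0 * np.diag(H_inv))
    i = int(np.argmin(saliencies))

    # Step 5: optimal adjustment of all remaining weights,
    # delta_w = -(w_i / [H^{-1}]_{ii}) H^{-1} e_i
    delta_w = -(w[i] / H_inv[i, i]) * H_inv[:, i]
    w_new = w + delta_w  # drives weight i exactly to zero
    return w_new, i, saliencies[i]
```

In a full implementation this step would sit inside an outer loop (step 6) that keeps pruning until the smallest saliency is no longer small compared with the mean-square error.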
