Neural Network Course 123 Lesson Plan (Graduate Courseware)

Chapter 2  Perceptron Network
2.1 Single-layer Perceptron Network
2.2 Multi-layer Perceptron Network

2.1 Single-layer Perceptron Network

A. A Neuron
Input vector: x_i (i = 1, 2, ..., n); w_ji is the weight from input x_i to output y_j; y_j (j = 1, 2, ..., m) is the output.
X_p (p = 1, 2, ..., P): P points in the n-dimensional space. The perceptron separates the P points into two classes.

B. Two-Dimensional Space
In the two-dimensional sample space the neuron's decision boundary is the linear equation
    w1·x1 + w2·x2 - θ = 0
[Figures: decision lines for the AND problem and the OR problem over the four points (0,0), (0,1), (1,0), (1,1).]
(A Python sketch of this single-neuron classifier is given at the end of the section.)

C. Learning Algorithm
1. Initialize the weights w_i(0) at random; set k = 0.
2. Select a training sample x_p with desired output d_p and compute the actual output y_p.
   Assume x_p0 = 1 and w_0 = -θ, so the threshold is absorbed into the weights.
3. Adjust the weights:
       w_i(k+1) = w_i(k) + η (d_p - y_p) x_pi,   i = 1, 2, ..., n
   where η is the learning rate.
4. Select the next sample and set k = k + 1.
   If w_i(k+1) = w_i(k) for all i = 1, 2, ..., n, then stop; otherwise go to step 2.

Learning process of an artificial neural network (flowchart): compute the output → is the desired output achieved? → Yes: stop; No: adjust the weights and repeat.
(A training-loop sketch of this algorithm is given at the end of the section.)

D. Limitation of the Simple Perceptron
The XOR relation is not linearly separable.

    x1   x2   y
    0    0    0
    0    1    1
    1    0    1
    1    1    0

A single neuron would have to satisfy all four conditions at once:
    0·w1 + 0·w2 < θ    ⇒  θ > 0
    0·w1 + 1·w2 ≥ θ    ⇒  w2 ≥ θ
    1·w1 + 0·w2 ≥ θ    ⇒  w1 ≥ θ
    1·w1 + 1·w2 < θ    ⇒  w1 + w2 < θ
The first three conditions force w1 + w2 ≥ 2θ > θ, which contradicts the last one, so no single line can separate the XOR patterns.

    Input pattern   Output pattern
    (0, 0)          0
    (0, 1)          1
    (1, 0)          1
    (1, 1)          0

2.2 Multi-layer Perceptron Network
Structure: input layer, hidden layer, output layer.
Indices: i = 1, 2, ..., n_q; j = 1, 2, ..., n_{q-1}; q = 1, 2, ..., Q.

XOR Problem (two-layer solution)
1. Design w11^(1), w12^(1) to get line L1.
   L1 equation: w11^(1)·x1^(0) + w12^(1)·x2^(0) - θ1^(1) = 0.
   L1 maps p2 to +1 and p1, p3, p4 to -1.
2. Design w21^(1), w22^(1) to get line L2, which maps p4 to -1 and p1, p2, p3 to +1.
3. The hidden-layer outputs give three points q1, q2, q3 in Fig. (b).
   Design w1^(2), w2^(2) to get line L3, which separates these points in the hidden space.
(A hand-designed two-layer XOR network is sketched at the end of the section.)

Chapter Highlights
1. The goal of artificial intelligence ...
3. An artificial neural network can be organized in many different ways, but the major elements are the processing elements, the connections among the processing elements, the inputs, the outputs, and the weights.
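The single neuron of sections A and B computes a hard threshold over the line w1·x1 + w2·x2 - θ = 0. Below is a minimal Python sketch of that decision rule; the concrete weight and threshold values used for AND and OR are illustrative choices, not values taken from the slides.

```python
# Single neuron: y = step(w1*x1 + w2*x2 - theta)
def neuron(x1, x2, w1, w2, theta):
    return 1 if w1 * x1 + w2 * x2 - theta > 0 else 0

points = [(0, 0), (0, 1), (1, 0), (1, 1)]

# AND: w1 = w2 = 1, theta = 1.5 -- only (1, 1) lies above the line
print("AND:", [neuron(x1, x2, 1.0, 1.0, 1.5) for x1, x2 in points])

# OR: w1 = w2 = 1, theta = 0.5 -- every point except (0, 0) lies above the line
print("OR: ", [neuron(x1, x2, 1.0, 1.0, 0.5) for x1, x2 in points])
```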
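The learning algorithm of section C follows directly from the update rule w_i(k+1) = w_i(k) + η(d_p - y_p)x_pi with x_p0 = 1 and w_0 = -θ. The sketch below assumes random initial weights, a fixed learning rate, and the AND training set; the stopping test mirrors step 4 (stop when a full pass over the samples changes no weight).

```python
import random

def step(s):
    return 1 if s > 0 else 0

def train_perceptron(samples, eta=0.2, max_epochs=100):
    """samples: list of ((x1, ..., xn), d) pairs with binary desired outputs d."""
    n = len(samples[0][0])
    # w[0] plays the role of -theta; the constant input x_p0 = 1 is prepended below
    w = [random.uniform(-0.5, 0.5) for _ in range(n + 1)]
    for _ in range(max_epochs):
        changed = False
        for xs, d in samples:
            x = (1,) + tuple(xs)                               # x_p0 = 1
            y = step(sum(wi * xi for wi, xi in zip(w, x)))     # actual output y_p
            if y != d:
                # w_i(k+1) = w_i(k) + eta * (d_p - y_p) * x_pi
                w = [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]
                changed = True
        if not changed:                                        # weights stable: stop
            break
    return w

and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print("learned weights (w0 = -theta):", train_perceptron(and_samples))
```

For linearly separable data such as AND or OR this loop stops with a valid set of weights; for XOR the weights never stabilize, which is exactly the limitation shown in section D.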
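The two-layer construction of section 2.2 hand-designs two hidden units (lines L1 and L2, with outputs in {-1, +1}) and one output unit (line L3) acting on the hidden points q1, q2, q3. The sketch below assumes p1 = (0,0), p2 = (1,0), p3 = (1,1), p4 = (0,1); these point labels and the concrete weights are illustrative choices consistent with that construction, not values taken from the slides.

```python
def sign(s):
    return 1 if s > 0 else -1   # hidden units output +1 / -1

def step(s):
    return 1 if s > 0 else 0    # output unit gives the final 0 / 1 answer

def xor_net(x1, x2):
    # Hidden line L1: outputs +1 only for p2 = (1, 0), -1 for the other three points
    z1 = sign(1.0 * x1 - 1.0 * x2 - 0.5)
    # Hidden line L2: outputs -1 only for p4 = (0, 1), +1 for the other three points
    z2 = sign(1.0 * x1 - 1.0 * x2 + 0.5)
    # Output line L3 separates the three hidden points q1, q2, q3
    # (w1 = 1, w2 = -1, theta = -1 in the hidden space)
    return step(1.0 * z1 - 1.0 * z2 + 1.0)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), "->", xor_net(x1, x2))
# Prints the XOR truth table: 0, 1, 1, 0
```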
