Perceptron neural network:
clear all
close all
P = [0 0 1 1; 0 1 0 1]        % input patterns, one per column
T = [0 1 1 1]                 % targets: logical OR of the two inputs
net = newp(minmax(P), 1, 'hardlim', 'learnp')   % one hard-limit neuron with the perceptron rule
net = train(net, P, T)
Y = sim(net, P)               % outputs of the trained perceptron
plotpv(P, T)                  % plot the input/target vectors
plotpc(net.iw{1,1}, net.b{1}) % overlay the learned decision boundary

Linear neural network:
clear all
close all
P = [1.0 2.1 3 4]             % scalar inputs
T = [2.0 4.01 5.9 8.0]        % targets, roughly T = 2*P
lr = maxlinlr(P, 'bias')      % maximum stable learning rate for a linear layer with a bias
net = newlin(minmax(P), 1, 0, lr)   % single linear neuron, no input delay
net.trainParam.epochs = 300
net.trainParam.goal = 0.05    % stop once the mean squared error falls below 0.05
net = train(net, P, T)
Y = sim(net, P)               % outputs of the trained linear network
clear all
close all
P = [1.1 2.2 3]
T = [2.1 4.3 5.9]
net = newlind(P, T)           % design the linear layer directly by least squares
Y = sim(net, P)
x = -4:.1:4
y = tansig(x)                 % hyperbolic tangent sigmoid transfer function
plot(x, y, '^-r')             % plot tansig over [-4, 4]

Sigmoid
As a reminder: σ(x) = 1 / (1 + exp(−x)).
Its derivative: d/dx σ(x) = (1 − σ(x)) · σ(x).
The problem here is exp, which quickly goes to infinity, even though the result of σ is restricted to the interval [0, 1]. The solution: the sigmoid can be expressed in terms of tanh: σ(x) = (1 + tanh(x/2)) / 2.
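A minimal MATLAB sketch of this trick (the handle names sigmoid_naive and sigmoid_tanh are illustrative, not toolbox functions):

% Naive and tanh-based sigmoid; the tanh form never builds the huge intermediate exp(-x)
sigmoid_naive = @(x) 1 ./ (1 + exp(-x));        % exp(-x) overflows to Inf for x < -709 in double precision
sigmoid_tanh  = @(x) 0.5 * (1 + tanh(x ./ 2));  % tanh saturates smoothly, no overflow
x = [-1000 -10 0 10 1000];
disp(exp(-x))              % contains Inf at x = -1000
disp(sigmoid_naive(x))     % still 0 at x = -1000, but only by dividing through Inf
disp(sigmoid_tanh(x))      % same values in [0, 1] without any overflow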
Softmax
Softmax, which is defined as softmax_i(a) = exp(a_i) / ∑_j exp(a_j) (where a is a vector), is a little more complicated. The key here is to express softmax in terms of the logsumexp function: logsumexp(a) = log(∑_i exp(a_i)), for which good, non-overflowing implementations are usually available.
Then, we have softmax(a) = exp(a − logsumexp(a)).
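A minimal MATLAB sketch, using the standard max-shift form of logsumexp (the handle names are illustrative):

% Softmax via logsumexp; shifting by max(a) keeps every exp argument <= 0
logsumexp = @(a) max(a) + log(sum(exp(a - max(a))));
softmax   = @(a) exp(a - logsumexp(a));
a = [1000 1001 1002];
disp(exp(a) ./ sum(exp(a)))   % naive softmax: Inf./Inf gives NaN
disp(softmax(a))              % stable: approximately [0.0900 0.2447 0.6652]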
As a bonus: the diagonal entries of the Jacobian of softmax are analogous to the sigmoid derivative, i.e. ∂softmax_i(a)/∂a_i = (1 − softmax_i(a)) · softmax_i(a).
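This identity can be checked numerically with a central difference; a quick illustrative sketch, reusing the stable softmax form from above:

softmax = @(a) exp(a - (max(a) + log(sum(exp(a - max(a))))));   % stable softmax as above
a = [0.5 -1.2 2.0]; i = 1; h = 1e-6;
s = softmax(a);
analytic = (1 - s(i)) * s(i);                         % claimed diagonal Jacobian entry
e = zeros(size(a)); e(i) = h;
numeric = (softmax(a + e) - softmax(a - e)) / (2*h);  % finite-difference column i of the Jacobian
disp([analytic, numeric(i)])                          % the two values agree closely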