07 Information Theory_03
Information Theory
Zheng Wei (郑伟), zhengw@

Last Class: Conditional Entropy

The expected entropy of Y after we have observed a value x ∈ X is called the conditional entropy H(Y|X):
H(Y|X) = Σ_x p(x)·H(Y|X=x) = −Σ_{x,y} p(x,y)·log p(y|x).

Noisy Channel Example

Another Example of H(X|Y)

Take p(X) over {0, …, 500} with p = (1/2, 1/1000, …, 1/1000), which has entropy H(X) = 1/2 + (1/2)·log₂ 1000 ≈ 5.483 bits. If we "learn" that x is not 0, then we increase the entropy: p(x | "x ≠ 0") = (0, 1/500, …, 1/500) with H(X | "x ≠ 0") = log₂ 500 ≈ 8.966 bits. We learned information, yet the entropy/uncertainty increased? Think: not finding your wallet in the likely place. The expected uncertainty (the conditional entropy) does go down: H(X | "x = 0?") = (1/2)·H(X | "x = 0") + (1/2)·H(X | "x ≠ 0") = (1/2)·0 + (1/2)·8.966 ≈ 4.483 bits. Think: the average value of H(X|X')? On average, conditioning reduces entropy. (A numeric check of this example appears in the sketch after these slides.)

Asymmetry of H(X|Y)

If the relation between X and Y is asymmetric, then in general H(X|Y) ≠ H(Y|X). Example of an asymmetric channel: assume again H(X) = 1 and note that H(Y) = H((3/4, 1/4)) ≈ 0.811. We have H(X|Y) ≈ 0.689 bits; on the other hand, H(Y|X) = 0.5 bits. Check with H(X,Y) = 3/2 and the chain rule: H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y). (See the second sketch after these slides.)

Chain Rule for Entropy

For random variables X1, …, Xn we have the chain rule:
H(X1, …, Xn) = Σ_{i=1}^{n} H(Xi | X1, …, Xi−1).

Mutual Information

For two variables X and Y, the mutual information I(X;Y) is the amount of certainty regarding X that we gain by observing Y. Hence I(X;Y) = H(X) − H(X|Y). Note that X and Y can be interchanged using the chain rule:
I(X;Y) = H(X) − H(X|Y) = H(X) + H(Y) − H(X,Y) = H(Y) − H(Y|X) = I(Y;X).
Think of I(X;Y) as the "overlap" between X and Y. Mutual information is symmetric!

All Together Now

Asymmetric Channel

About Mutual Information

Mutual information is the central notion in information theory. It quantifies how much we learn about X by observing Y. When X and Y are the same we get I(X;X) = H(X); hence entropy is also called "self-information". It can be generalized to more than two variables.

Expectation of What?

Mutual information can be viewed as an expectation:
I(X;Y) = Σ_{x,y} p(x,y)·log [ p(x,y) / (p(x)·p(y)) ] = E[ log p(X,Y) / (p(X)·p(Y)) ].

Relative Entropy

The relative entropy or Kullback-Leibler distance between two distributions p and q is defined by
D(p‖q) = Σ_x p(x)·log [ p(x) / q(x) ].

Example: KL Distance
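To make the wallet example concrete, here is a small Python sketch (not part of the original slides) that recomputes H(X), H(X | "x ≠ 0"), and the expected conditional entropy H(X | "x = 0?") for the distribution p = (1/2, 1/1000, …, 1/1000) over {0, …, 500}; the entropy helper is a straightforward implementation of the Shannon formula.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector (terms with p = 0 contribute 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# p(X) over {0, ..., 500}: p(0) = 1/2, p(i) = 1/1000 for i = 1..500
p = [0.5] + [1 / 1000] * 500
print(entropy(p))                # H(X) = 1/2 + (1/2)*log2(1000) ≈ 5.483 bits

# Conditioning on the specific outcome "x != 0" *increases* the entropy:
p_given_not0 = [0.0] + [1 / 500] * 500
print(entropy(p_given_not0))     # H(X | "x != 0") = log2(500) ≈ 8.966 bits

# But the *expected* conditional entropy (averaging over the answer to "x = 0?") goes down:
h_cond = 0.5 * 0.0 + 0.5 * entropy(p_given_not0)
print(h_cond)                    # H(X | "x = 0?") ≈ 4.483 bits, smaller than H(X)
```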
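The asymmetric-channel slide quotes H(X) = 1, H(Y) ≈ 0.811, H(X|Y) ≈ 0.689 and H(Y|X) = 0.5, but the joint distribution itself was shown in a figure that did not survive extraction. One joint distribution consistent with all of those numbers is p(0,0) = 1/2, p(1,0) = p(1,1) = 1/4 (X fair; X = 0 always yields Y = 0, while X = 1 yields a fair coin). The sketch below assumes that distribution and rechecks the chain-rule identities.

```python
import math
from collections import defaultdict

def H(dist):
    """Entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, axis):
    """Marginal of a joint dict keyed by (x, y); axis 0 gives p(x), axis 1 gives p(y)."""
    m = defaultdict(float)
    for xy, p in joint.items():
        m[xy[axis]] += p
    return dict(m)

# Assumed joint distribution consistent with the slide's numbers.
joint = {(0, 0): 0.5, (1, 0): 0.25, (1, 1): 0.25}

pX, pY = marginal(joint, 0), marginal(joint, 1)
H_XY = H(joint)
print(H(pX), H(pY))    # H(X) = 1.0, H(Y) = H((3/4, 1/4)) ≈ 0.811
print(H_XY)            # H(X,Y) = 1.5
print(H_XY - H(pY))    # H(X|Y) = H(X,Y) - H(Y) ≈ 0.689
print(H_XY - H(pX))    # H(Y|X) = H(X,Y) - H(X) = 0.5, so H(X|Y) != H(Y|X)
```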
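Finally, the "Expectation of What?" and "Relative Entropy" slides connect mutual information to the Kullback-Leibler distance: I(X;Y) = D(p(x,y) ‖ p(x)p(y)). A short sketch, reusing the joint distribution assumed in the previous example, illustrates this identity numerically.

```python
import math

def kl(p, q):
    """Relative entropy D(p || q) in bits; assumes q(x) > 0 wherever p(x) > 0."""
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

# Same assumed joint distribution as in the previous sketch.
joint = {(0, 0): 0.5, (1, 0): 0.25, (1, 1): 0.25}
pX = {0: 0.5, 1: 0.5}
pY = {0: 0.75, 1: 0.25}
product = {(x, y): pX[x] * pY[y] for x in pX for y in pY}

# I(X;Y) as an expectation: E[ log p(X,Y) / (p(X)p(Y)) ] = D(p(x,y) || p(x)p(y))
I = kl(joint, product)
print(I)   # ≈ 0.311 bits, matching I(X;Y) = H(X) - H(X|Y) = 1 - 0.689
```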