Chen Yuehui, Computational Intelligence Laboratory, University of Jinan (lecture slides)

Estimation of Distribution Algorithms: The Idea

In a genetic algorithm, crossover and mutation can destroy individuals that have already been well optimized. To avoid this, a new class of evolutionary algorithms emerged: Estimation of Distribution Algorithms (EDAs). An EDA uses no crossover or mutation. Instead, it builds a probabilistic model of the good individuals and samples from that model to produce the next generation.

From GA to EDA: Population Based Incremental Learning (PBIL, Baluja, 1994)

Basic PBIL:

    P <- initialize probability vector (each position = 0.5)
    while (generations++ < limit)
        for each vector i do
            for each position j do
                generate Vi(j) according to P(j)
            end-do
            evaluate f(Vi)
        end-do
        Vmax <- the vector Vi with maximal f(Vi)
        update P according to Vmax
        if random(0,1] < Pmutate
            mutate P
        end-if
    end-while

Details:
- The population is replaced by a probability vector P = {p1, p2, ..., pl}, where pi is the probability of a 1 in the ith bit.
- Each generation, n individuals are sampled from P.
- P is updated using the best individual x: pi(t+1) = α·xi + (1 - α)·pi(t), i = 1, 2, ..., l.
- Mutating P: pi(t+1) = αm·U[0,1) + (1 - αm)·pi(t+1).

PBIL example (maximizing the number of 1 bits):
- t = 0, P = {0.5, 0.5, 0.5, 0.5}
- Generate 5 individuals: {1010, 1100, 0100, 0111, 0001}
- Fitness: {2, 2, 1, 3, 1}
- Best individual: 0111; α = 0.1
- Update P:
  p1 = 0.1·0 + 0.5·(1 - 0.1) = 0.45
  p2 = p3 = p4 = 0.1·1 + 0.5·(1 - 0.1) = 0.55

Some applications: function optimization, job-shop scheduling, the TSP, bin packing, the knapsack problem, neural-network weight training.

Estimation of Distribution Algorithms: General Framework

Estimation of Distribution Algorithms do just that. Typically they operate as follows:

    Step 0: Randomly generate a set of λ individuals (t = 0)
    Step 1: Evaluate the λ individuals
    While (not done)
        Step 2: Select μ individuals (where μ ≤ λ) to be parents;
                build a probability distribution/density function pt from the parents
        Step 3: Create λ offspring by sampling from pt
        Step 4: Evaluate the offspring
        Step 5: The λ offspring replace the previous population (t = t + 1)
        Step 6: Goto While

What models to use? Start with a probability vector for binary strings, or a Gaussian distribution for real values. Later: dependency-tree models (COMIT) and Bayesian networks.

Estimation of Distribution Algorithms: Probability Vector (PMBGAs)

The probability-vector EDA is known as the Univariate Marginal Distribution Algorithm (UMDA). As an example, consider maximizing f(x) = x^2, where -2.0 ≤ x ≤ 2.0. Let l = 7, so the mapping from a 7-bit string c to a real value is

    d(2, -2, 7, c) = -2 + (2 - (-2)) / (2^7 - 1) · int(c)

where int(c) is the integer encoded by the bits of c.
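The PBIL loop and update rule above can be sketched in Python. The parameter defaults (pop_size, alpha, the mutation constants) are illustrative choices, not values fixed by the slides:

```python
import random

def pbil(fitness, length, pop_size=5, alpha=0.1,
         p_mutate=0.02, alpha_m=0.05, generations=100):
    """Population Based Incremental Learning (Baluja, 1994).

    The population is replaced by a probability vector P; each
    generation samples pop_size bit strings from P, then shifts P
    toward the best sample: p_i <- alpha*x_i + (1 - alpha)*p_i.
    """
    P = [0.5] * length                       # each position starts at 0.5
    for _ in range(generations):
        pop = [[1 if random.random() < p else 0 for p in P]
               for _ in range(pop_size)]
        best = max(pop, key=fitness)         # Vmax, the best sampled vector
        P = [alpha * x + (1 - alpha) * p for x, p in zip(best, P)]
        if random.random() < p_mutate:       # occasionally mutate P itself
            P = [alpha_m * random.random() + (1 - alpha_m) * p for p in P]
    return P

# One update step from the slides' example: best individual 0111, alpha = 0.1
P = [0.5, 0.5, 0.5, 0.5]
best = [0, 1, 1, 1]
P = [0.1 * x + 0.9 * p for x, p in zip(best, P)]
print([round(p, 2) for p in P])  # [0.45, 0.55, 0.55, 0.55]
```

Note that only the probability vector persists between generations; the sampled population is discarded each iteration, which is what distinguishes PBIL from a conventional GA.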
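A minimal sketch of UMDA on the f(x) = x^2 example, using the d(2, -2, 7, c) decoding described above. The population size n = 50, parent count mu = 10, and generation count are assumptions for illustration:

```python
import random

def decode(bits, lo=-2.0, hi=2.0):
    """Map a bit string c to a real in [lo, hi]:
    d(hi, lo, l, c) = lo + (hi - lo) / (2**l - 1) * int(c)."""
    l = len(bits)
    value = int("".join(map(str, bits)), 2)
    return lo + (hi - lo) * value / (2**l - 1)

def umda(f, l=7, n=50, mu=10, generations=30):
    """Univariate Marginal Distribution Algorithm (sketch).

    Sample n individuals from the product of marginals P, select the
    mu best as parents, then re-estimate each marginal p_j as the
    frequency of 1s at bit j among the parents, and resample.
    """
    P = [0.5] * l
    for _ in range(generations):
        pop = [[1 if random.random() < p else 0 for p in P]
               for _ in range(n)]
        parents = sorted(pop, key=lambda c: f(decode(c)), reverse=True)[:mu]
        P = [sum(c[j] for c in parents) / mu for j in range(l)]
    return decode(max(pop, key=lambda c: f(decode(c))))

random.seed(0)
best_x = umda(lambda x: x * x)
print(best_x)  # should approach +2 or -2, the maxima of x^2 on [-2, 2]
```

Because each bit's marginal is re-estimated independently, this sketch captures no dependencies between variables; that limitation is what the dependency-tree and Bayesian-network models mentioned above address.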
