Pattern Recognition, Chapter 3 Summary (Part 1)

Published 2017-07-23 in Hubei

* Maximum Likelihood Estimation
* Example: the Gaussian case with unknown mean
* Example: the Gaussian case with unknown mean and covariance
* Gaussian Mixture
* Bayes Estimation: whereas maximum-likelihood methods view the true parameter vector as fixed, Bayesian methods treat it as a random variable, and the training data allow us to convert a prior distribution on this variable into a posterior probability density.
* Pattern recognition system: Gaussian density; Gaussian classifier; estimate the mean vector and the covariance matrix.
* Gaussian Classifiers: probability density function and classification (discriminant) function.
* Assuming independent features with equal variance: nearest-distance (nearest-mean) classification, which is also a linear discriminant function (LDF).
* Assuming equal covariance matrices across classes: linear discriminant function (LDF).
* Assuming arbitrary covariance matrices and equal prior probabilities: quadratic discriminant function (QDF); decision surface.
* Parameter estimation of the Gaussian density: maximum likelihood (ML).
* The case of a shared covariance matrix.
* Are parametric classifiers not useful in practice? On the contrary:
  - In practice, the class distributions of many problems are approximately Gaussian.
  - Even when a distribution deviates considerably from Gaussian, parametric classifiers still perform fairly well when the feature dimensionality is high and training samples are few (the curse of dimensionality); sometimes the LDF even outperforms the QDF.
  - Advantages of ML estimation: the training cost is small (linear in the number of classes and the number of samples), and in high-dimensional settings dimensionality reduction (feature selection or transformation) is often beneficial.
* Improving the Gaussian classifier. Problems with the QDF: it has too many parameters (proportional to the square of the dimensionality); with few training samples the covariance matrix becomes singular; and even when it is non-singular, the ML estimate generalizes poorly. Regularized discriminant analysis (RDA) overcomes the singularity by smoothing the covariance matrix, and improves generalization at the same time.
* We could design an optimal classifier if we knew the prior probabilities and the class-conditional densities. One approach is to use the samples to estimate the unknown probabilities and densities, and then use the resulting estimates as if they were the true values.

Chapter 3, Homework 4:
1. Let … be a sample set drawn from the normal distribution …; derive the maximum-likelihood estimator of the parameter ….
2. Let …
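The Bayesian view described above has a closed form in the simplest setting. Assuming a univariate Gaussian likelihood $N(\mu, \sigma^2)$ with known $\sigma^2$ and a Gaussian prior $N(\mu_0, \sigma_0^2)$ on $\mu$ (the standard conjugate-prior example, not reproduced from the slides), the posterior is again Gaussian:

```latex
p(\mu \mid \mathcal{D}) = N(\mu_n, \sigma_n^2), \qquad
\mu_n = \frac{n\sigma_0^2}{n\sigma_0^2 + \sigma^2}\,\bar{x}_n
      + \frac{\sigma^2}{n\sigma_0^2 + \sigma^2}\,\mu_0, \qquad
\sigma_n^2 = \frac{\sigma_0^2\,\sigma^2}{n\sigma_0^2 + \sigma^2}
```

The posterior mean interpolates between the sample mean $\bar{x}_n$ and the prior mean $\mu_0$; as $n \to \infty$ the prior's influence vanishes and the Bayesian estimate converges to the ML estimate, which is the contrast the slides draw between the two methods.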
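The ML estimates for the Gaussian case discussed in the slides can be sketched as follows (a minimal NumPy illustration, not taken from the slides themselves; note the ML covariance uses the 1/n normalization, not the unbiased 1/(n-1)):

```python
import numpy as np

def gaussian_mle(X):
    """Maximum-likelihood estimates of a Gaussian's mean and covariance.

    X: (n, d) array of n samples in d dimensions.
    Returns (mu_hat, sigma_hat); sigma_hat uses the ML (1/n) normalization.
    """
    n = X.shape[0]
    mu_hat = X.mean(axis=0)
    centered = X - mu_hat
    sigma_hat = centered.T @ centered / n
    return mu_hat, sigma_hat

# Example: the estimates approach the true parameters as n grows.
rng = np.random.default_rng(0)
true_mu = [1.0, -2.0]
true_sigma = [[2.0, 0.5], [0.5, 1.0]]
X = rng.multivariate_normal(mean=true_mu, cov=true_sigma, size=10_000)
mu_hat, sigma_hat = gaussian_mle(X)
```

The low training cost claimed in the slides is visible here: one pass over the data per class suffices, so the cost is linear in the number of samples and classes.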
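The "Gaussian Mixture" heading above has no surviving body text; as a hedged sketch of the usual treatment, EM alternates computing posterior responsibilities with weighted ML updates. The two-component one-dimensional version and the percentile-based initialization below are illustrative choices, not the slides' own:

```python
import numpy as np

def normal_pdf(x, mu, var):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def em_gmm_1d(x, n_iter=200):
    """EM for a two-component 1-D Gaussian mixture (sketch)."""
    # Crude initialization: place the means at the quartiles.
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        r = pi * np.stack([normal_pdf(x, mu[k], var[k]) for k in range(2)], axis=1)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted ML updates of weights, means, variances.
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    return pi, mu, var

# Example on well-separated data drawn from N(-5, 1) and N(5, 1).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-5, 1, 500), rng.normal(5, 1, 500)])
pi, mu, var = em_gmm_1d(x)
```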
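The QDF and LDF discriminants mentioned in the slides can be written down directly from the Gaussian log-density; the functions below are a minimal sketch (the class means, covariance, and test point are made-up illustration values). When all classes share one covariance matrix, the quadratic term is common to every class and cancels, which is exactly why the shared-covariance case reduces to a linear discriminant:

```python
import numpy as np

def qdf_score(x, mu, sigma, prior):
    """Quadratic discriminant function (QDF) for one class:
    g(x) = -0.5 (x-mu)^T Sigma^{-1} (x-mu) - 0.5 log|Sigma| + log P(class)."""
    diff = x - mu
    return (-0.5 * diff @ np.linalg.solve(sigma, diff)
            - 0.5 * np.log(np.linalg.det(sigma))
            + np.log(prior))

def ldf_score(x, mu, sigma_inv, prior):
    """Linear discriminant function (LDF) under a shared covariance matrix:
    the quadratic term common to all classes drops out, leaving w^T x + b."""
    w = sigma_inv @ mu
    b = -0.5 * mu @ sigma_inv @ mu + np.log(prior)
    return w @ x + b

# Two classes with identity covariance and equal priors: both rules agree,
# and both reduce to nearest-mean classification in this special case.
mus = [np.array([0.0, 0.0]), np.array([4.0, 4.0])]
eye = np.eye(2)  # identity covariance; it is its own inverse
x = np.array([0.5, 0.5])
qdf_label = int(np.argmax([qdf_score(x, m, eye, 0.5) for m in mus]))
ldf_label = int(np.argmax([ldf_score(x, m, eye, 0.5) for m in mus]))
```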
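The RDA smoothing that the slides propose for the QDF's singular-covariance problem can be sketched as follows. This is one common two-parameter form (shrinkage toward the pooled covariance, then toward a scaled identity); the slides do not specify which variant they use, so treat the parameterization as an assumption:

```python
import numpy as np

def rda_covariance(sigma_k, sigma_pooled, lam, gamma):
    """Regularized covariance for class k (one common two-parameter form):
    shrink toward the pooled covariance (lam), then toward a scaled
    identity (gamma). lam = gamma = 0 recovers the QDF covariance;
    lam = 1, gamma = 0 recovers the LDF (shared) covariance."""
    d = sigma_k.shape[0]
    sigma = (1.0 - lam) * sigma_k + lam * sigma_pooled
    return (1.0 - gamma) * sigma + gamma * (np.trace(sigma) / d) * np.eye(d)

# A rank-deficient class covariance (as from too few samples) is singular,
# but becomes invertible after smoothing toward the pooled estimate.
sigma_k = np.array([[1.0, 1.0], [1.0, 1.0]])  # singular: det = 0
sigma_pooled = np.eye(2)
sigma_reg = rda_covariance(sigma_k, sigma_pooled, lam=0.5, gamma=0.0)
```

Interpolating between the QDF and LDF extremes is what lets RDA trade model flexibility against estimation variance, matching the slides' remark that the LDF sometimes beats the QDF when samples are scarce.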
