Pattern Classification, Chapter 2 (Part 3)

All materials in these slides were taken from Pattern Classification (2nd ed.) by R. O. Duda, P. E. Hart and D. G. Stork, John Wiley & Sons, 2000, with the permission of the authors and the publisher.

Chapter 2 (Part 3): Bayesian Decision Theory (Sections 2.6, 2.9)
- Discriminant Functions for the Normal Density
- Bayes Decision Theory: Discrete Features

2.6 Discriminant Functions for the Normal Density

We saw that minimum error-rate classification can be achieved by the discriminant function
gi(x) = ln p(x | ωi) + ln P(ωi).
Case of the multivariate normal density: a standard expanded form of this discriminant is sketched below.
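The equation images from the original slides are not preserved in this text. Assuming p(x | ωi) is a multivariate normal density N(μi, Σi), substituting it into gi(x) gives the standard form used throughout this section:

\[
g_i(\mathbf{x}) = -\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^t \boldsymbol{\Sigma}_i^{-1} (\mathbf{x}-\boldsymbol{\mu}_i) - \tfrac{d}{2}\ln 2\pi - \tfrac{1}{2}\ln|\boldsymbol{\Sigma}_i| + \ln P(\omega_i)
\]

The three cases that follow simplify this expression under increasingly general assumptions about the covariance matrices Σi.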

Case Σi = σ²I (I stands for the identity matrix)

In this case the discriminant functions reduce to linear functions of x (see the sketch below). A classifier that uses linear discriminant functions is called a "linear machine". The decision surfaces for a linear machine are pieces of hyperplanes defined by gi(x) = gj(x). The hyperplane separating Ri and Rj is always orthogonal to the line linking the means.
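With Σi = σ²I, the quadratic term and the covariance terms in gi(x) are the same for every class and can be dropped, which yields the standard linear form:

\[
g_i(\mathbf{x}) = \mathbf{w}_i^t \mathbf{x} + w_{i0}, \qquad
\mathbf{w}_i = \frac{\boldsymbol{\mu}_i}{\sigma^2}, \qquad
w_{i0} = -\frac{\boldsymbol{\mu}_i^t \boldsymbol{\mu}_i}{2\sigma^2} + \ln P(\omega_i)
\]

Setting gi(x) = gj(x) gives the separating hyperplane

\[
\mathbf{w}^t(\mathbf{x}-\mathbf{x}_0) = 0, \qquad
\mathbf{w} = \boldsymbol{\mu}_i - \boldsymbol{\mu}_j, \qquad
\mathbf{x}_0 = \tfrac{1}{2}(\boldsymbol{\mu}_i + \boldsymbol{\mu}_j) - \frac{\sigma^2}{\lVert\boldsymbol{\mu}_i - \boldsymbol{\mu}_j\rVert^{2}}\,\ln\frac{P(\omega_i)}{P(\omega_j)}\,(\boldsymbol{\mu}_i - \boldsymbol{\mu}_j)
\]

Since w = μi - μj, the hyperplane is orthogonal to the line linking the means, and with equal priors it passes through the midpoint of that line.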

Case Σi = Σ (the covariance matrices of all classes are identical but otherwise arbitrary)

The discriminant functions are again linear, so the decision boundary is a hyperplane separating Ri and Rj; however, this hyperplane is generally not orthogonal to the line between the means (see the sketch below).
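Dropping the class-independent terms under the shared-covariance assumption leaves the standard linear discriminant and boundary:

\[
g_i(\mathbf{x}) = \mathbf{w}_i^t \mathbf{x} + w_{i0}, \qquad
\mathbf{w}_i = \boldsymbol{\Sigma}^{-1}\boldsymbol{\mu}_i, \qquad
w_{i0} = -\tfrac{1}{2}\boldsymbol{\mu}_i^t \boldsymbol{\Sigma}^{-1}\boldsymbol{\mu}_i + \ln P(\omega_i)
\]

\[
\mathbf{w} = \boldsymbol{\Sigma}^{-1}(\boldsymbol{\mu}_i - \boldsymbol{\mu}_j), \qquad
\mathbf{x}_0 = \tfrac{1}{2}(\boldsymbol{\mu}_i + \boldsymbol{\mu}_j) - \frac{\ln\left[P(\omega_i)/P(\omega_j)\right]}{(\boldsymbol{\mu}_i - \boldsymbol{\mu}_j)^t \boldsymbol{\Sigma}^{-1} (\boldsymbol{\mu}_i - \boldsymbol{\mu}_j)}\,(\boldsymbol{\mu}_i - \boldsymbol{\mu}_j)
\]

Because w = Σ⁻¹(μi - μj) is in general not parallel to μi - μj, the hyperplane need not be orthogonal to the line between the means.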

Case Σi = arbitrary

The covariance matrices are different for each category, so the discriminant functions remain quadratic (see below) and the decision surfaces are hyperquadrics: hyperplanes, pairs of hyperplanes, hyperspheres, hyperellipsoids, hyperparaboloids, and hyperboloids of various types.
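When each class keeps its own covariance matrix, only the class-independent constant (d/2) ln 2π can be dropped, and the discriminant is quadratic in x:

\[
g_i(\mathbf{x}) = \mathbf{x}^t \mathbf{W}_i \mathbf{x} + \mathbf{w}_i^t \mathbf{x} + w_{i0}
\]
\[
\mathbf{W}_i = -\tfrac{1}{2}\boldsymbol{\Sigma}_i^{-1}, \qquad
\mathbf{w}_i = \boldsymbol{\Sigma}_i^{-1}\boldsymbol{\mu}_i, \qquad
w_{i0} = -\tfrac{1}{2}\boldsymbol{\mu}_i^t \boldsymbol{\Sigma}_i^{-1}\boldsymbol{\mu}_i - \tfrac{1}{2}\ln|\boldsymbol{\Sigma}_i| + \ln P(\omega_i)
\]

Setting gi(x) = gj(x) equates two different quadratic forms, which is why the resulting decision surfaces are hyperquadrics.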

Example: R1 = {(3,8), (3,4), (2,6), (4,6)}; R2 = {(3,0), (3,-4), (1,-2), (5,-2)}. A worked sketch of this example follows below.
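A minimal sketch of this example, assuming the task is to fit a Gaussian to each class and classify with the arbitrary-covariance discriminant above; equal priors and maximum-likelihood (1/n) covariance estimates are assumptions, not something stated on the slide:

import numpy as np

# Sample points from the slide.
R1 = np.array([[3, 8], [3, 4], [2, 6], [4, 6]], dtype=float)
R2 = np.array([[3, 0], [3, -4], [1, -2], [5, -2]], dtype=float)

def make_discriminant(samples, prior):
    """Quadratic Gaussian discriminant for one class:
    g(x) = -1/2 (x-mu)^t S^-1 (x-mu) - 1/2 ln|S| + ln P(w);
    the class-independent -(d/2) ln(2*pi) term is omitted."""
    mu = samples.mean(axis=0)
    S = np.cov(samples, rowvar=False, bias=True)  # 1/n (ML) estimate
    S_inv = np.linalg.inv(S)
    log_det = np.log(np.linalg.det(S))
    def g(x):
        d = x - mu
        return -0.5 * d @ S_inv @ d - 0.5 * log_det + np.log(prior)
    return g

g1 = make_discriminant(R1, prior=0.5)   # mean (3, 6), covariance diag(0.5, 2)
g2 = make_discriminant(R2, prior=0.5)   # mean (3, -2), covariance diag(2, 2)

x = np.array([2.5, 2.0])                # an arbitrary test point
print("assigned to class", 1 if g1(x) > g2(x) else 2)

Because the two estimated covariance matrices differ, the boundary g1(x) = g2(x) is a hyperquadric rather than a straight line; for these particular points it works out to a parabola in the (x1, x2) plane.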

2.9 Bayes Decision Theory: Discrete Features

The components of x are binary or integer valued; x can take only one of m discrete values v1, v2, ..., vm.

Case of independent binary features in the two-category problem: let x = (x1, x2, ..., xd)^t, where each xi is either 0 or 1, with probabilities
pi = P(xi = 1 | ω1), qi = P(xi = 1 | ω2).
The discriminant function in this case is linear in the xi; a standard form is sketched below.

Assignment: 2.6.25, 2.9.43
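The slide's formula image is not preserved in the text. Assuming the xi are conditionally independent given the class, the standard two-category discriminant (decide ω1 if g(x) > 0) is:

\[
g(\mathbf{x}) = \sum_{i=1}^{d} w_i x_i + w_0, \qquad
w_i = \ln\frac{p_i(1-q_i)}{q_i(1-p_i)}, \qquad
w_0 = \sum_{i=1}^{d}\ln\frac{1-p_i}{1-q_i} + \ln\frac{P(\omega_1)}{P(\omega_2)}
\]

Each weight wi measures how strongly observing xi = 1 favors ω1 over ω2, and w0 collects the prior odds together with the contribution of the features when they are all 0.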
