NLP (Natural Language Processing) — N-Gram Language Model Slides
CS 388: Natural Language Processing: N-Gram Language Models
Raymond J. Mooney, University of Texas at Austin

Language Models
- Formal grammars (e.g., regular, context-free) give a hard "binary" model of the legal sentences in a language.
- For NLP, a probabilistic model of a language, one that gives the probability that a string is a member of the language, is more useful.
- To specify a correct probability distribution, the probabilities of all sentences in the language must sum to 1.

Uses of Language Models
- Speech recognition: "I ate a cherry" is a more likely sentence than "Eye eight uh Jerry."
- OCR and handwriting recognition: more probable sentences are more likely to be correct readings.
- Machine translation: more likely sentences are probably better translations.
- Generation: more likely sentences are probably better natural-language generations.
- Context-sensitive spelling correction: "Their are problems wit this sentence."

Completion Prediction
- A language model also supports predicting the completion of a sentence:
  - "Please turn off your cell _____"
  - "Your program does not ______"
- Predictive text input systems can guess what you are typing and offer choices for completing it.

N-Gram Models
- Estimate the probability of each word given its prior context, e.g., P(phone | Please turn off your cell).
- The number of parameters required grows exponentially with the number of words of prior context.
- An N-gram model uses only N-1 words of prior context:
  - Unigram: P(phone)
  - Bigram: P(phone | cell)
  - Trigram: P(phone | your cell)
- The Markov assumption is the presumption that the future behavior of a dynamical system depends only on its recent history. In particular, in a kth-order Markov model, the next state depends only on the k most recent states; an N-gram model is therefore an (N-1)-order Markov model.

N-Gram Model Formulas
- Word sequences: w1..wn (written w1^n)
- Chain rule of probability: P(w1^n) = P(w1) P(w2|w1) P(w3|w1^2) ... P(wn|w1^(n-1))
- Bigram approximation: P(w1^n) ≈ product over k = 1..n of P(wk | wk-1)
- N-gram approximation: P(w1^n) ≈ product over k = 1..n of P(wk | w(k-N+1)^(k-1))

Estimating Probabilities
- N-gram conditional probabilities can be estimated from ra
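The bigram estimation and completion prediction described above can be sketched in a few lines of Python. This is a minimal illustration, not the slides' implementation: the maximum-likelihood estimate P(w2 | w1) = count(w1 w2) / count(w1) is standard, but the toy corpus, function names, and `<s>`/`</s>` boundary markers are assumptions for the example.

```python
from collections import Counter

def train_bigram(sentences):
    """Estimate bigram probabilities by maximum likelihood from a toy corpus."""
    unigram = Counter()
    bigram = Counter()
    for s in sentences:
        tokens = ["<s>"] + s.split() + ["</s>"]
        unigram.update(tokens[:-1])          # count every word that conditions a successor
        bigram.update(zip(tokens, tokens[1:]))  # count adjacent word pairs
    # MLE: P(w2 | w1) = count(w1 w2) / count(w1)
    return {(w1, w2): c / unigram[w1] for (w1, w2), c in bigram.items()}

def complete(probs, context):
    """Completion prediction: the most probable next word given one context word."""
    candidates = {w2: p for (w1, w2), p in probs.items() if w1 == context}
    return max(candidates, key=candidates.get) if candidates else None

corpus = [
    "please turn off your cell phone",
    "please turn in your homework",
    "turn off your cell phone",
]
probs = train_bigram(corpus)
print(complete(probs, "cell"))  # → phone
```

On this corpus, "phone" always follows "cell", so P(phone | cell) = 1.0; after "turn", the model prefers "off" (2 of 3 occurrences) over "in". Real language models additionally smooth these counts so that unseen bigrams do not receive zero probability.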