
POS Tagging

Words often have more than one POS. Consider "back":
  - The back door = JJ
  - On my back = NN
  - Win the voters back = RB
  - Promised to back the bill = VB
The POS tagging problem is to determine the POS tag for a particular instance of a word. (These examples are from Dekang Lin.)

How hard is POS tagging? Measuring ambiguity
(A small counting sketch appears after the slides.)

Three methods for POS tagging
  1. Rule-based tagging (ENGTWOL)
  2. Stochastic (= probabilistic) tagging: HMM (Hidden Markov Model) tagging
  3. Transformation-based tagging: the Brill tagger

Hidden Markov Model Tagging
Using an HMM to do POS tagging is a special case of Bayesian inference, which builds on foundational work in computational linguistics:
  - Bledsoe 1959: OCR
  - Mosteller and Wallace 1964: authorship identification
It is also related to the "noisy channel" model that we'll see when we do ASR (speech recognition).

POS tagging as a sequence classification task
We are given a sentence (an "observation" or "sequence of observations"):
  Secretariat is expected to race tomorrow
What is the best sequence of tags that corresponds to this sequence of observations? The probabilistic view: consider all possible sequences of tags, and out of this universe of sequences choose the tag sequence that is most probable given the observation sequence of n words w1…wn.

Getting to HMM
We want, out of all sequences of n tags t1…tn, the single tag sequence such that P(t1…tn | w1…wn) is highest:

  \hat{t}_1^n = \operatorname{argmax}_{t_1^n} P(t_1^n \mid w_1^n)

The hat (^) means "our estimate of the best one", and argmax_x f(x) means "the x such that f(x) is maximized".

This equation is guaranteed to give us the best tag sequence, but how do we make it operational? How do we compute this value? The intuition of Bayesian classification: use Bayes' rule to transform it into a set of other probabilities that are easier to compute.

Using Bayes Rule
Applying Bayes' rule, and then dropping the denominator P(w_1^n) because it is the same for every candidate tag sequence:

  \hat{t}_1^n = \operatorname{argmax}_{t_1^n} \frac{P(w_1^n \mid t_1^n)\, P(t_1^n)}{P(w_1^n)} = \operatorname{argmax}_{t_1^n} P(w_1^n \mid t_1^n)\, P(t_1^n)

Likelihood and prior
The first factor, P(w_1^n | t_1^n), is the likelihood of the words given the tags; the second, P(t_1^n), is the prior probability of the tag sequence. (A toy decoder sketch using both appears after the slides.)

Two kinds of probabilities
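The "measuring ambiguity" slide lost its figures in extraction. As a stand-in, here is a minimal Python sketch (my own, not from the slides) of how tag ambiguity is typically measured: count the word types in a tagged corpus that take more than one tag. The `tagged` token list is invented for illustration; a real measurement would use a full tagged corpus such as Brown.

    from collections import defaultdict

    # Tiny invented sample of (word, tag) tokens; real measurements
    # run over a full hand-tagged corpus.
    tagged = [("back", "NN"), ("back", "RB"), ("back", "VB"),
              ("the", "DT"), ("door", "NN"), ("bill", "NN")]

    # Collect the set of tags observed for each word type.
    tags_per_word = defaultdict(set)
    for word, tag in tagged:
        tags_per_word[word].add(tag)

    ambiguous = [w for w, tags in tags_per_word.items() if len(tags) > 1]
    print(f"{len(ambiguous)}/{len(tags_per_word)} word types are ambiguous")
    # -> 1/4 word types are ambiguous (here, only "back")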
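To make argmax P(w_1^n | t_1^n) P(t_1^n) concrete, here is a small, self-contained Python sketch of a bigram HMM tagger. It is my own illustration, not the lecture's code: the toy corpus is invented, the likelihood and prior are estimated by raw counts (no smoothing), and the argmax is computed with Viterbi decoding, which these slides have not yet introduced.

    from collections import defaultdict

    # Toy hand-tagged corpus (invented for illustration); Penn Treebank-style tags.
    corpus = [
        [("the", "DT"), ("back", "NN"), ("door", "NN")],
        [("win", "VB"), ("the", "DT"), ("voters", "NNS"), ("back", "RB")],
        [("promised", "VBN"), ("to", "TO"), ("back", "VB"), ("the", "DT"), ("bill", "NN")],
    ]

    # Count tag->tag transitions (for the prior P(t_i | t_{i-1}))
    # and tag->word emissions (for the likelihood P(w_i | t_i)).
    trans = defaultdict(lambda: defaultdict(int))
    emit = defaultdict(lambda: defaultdict(int))
    for sent in corpus:
        prev = "<s>"                      # sentence-start pseudo-tag
        for word, tag in sent:
            trans[prev][tag] += 1
            emit[tag][word] += 1
            prev = tag

    def p_trans(prev, tag):
        total = sum(trans[prev].values())
        return trans[prev][tag] / total if total else 0.0

    def p_emit(tag, word):
        total = sum(emit[tag].values())
        return emit[tag][word] / total if total else 0.0

    def viterbi(words):
        """Return the tag sequence maximizing P(w_1..n | t_1..n) * P(t_1..n)."""
        tags = list(emit)
        # best[i][t] = (prob of best path ending in tag t at position i, backpointer)
        best = [{t: (p_trans("<s>", t) * p_emit(t, words[0]), None) for t in tags}]
        for i in range(1, len(words)):
            column = {}
            for t in tags:
                prob, back = max(
                    (best[i - 1][s][0] * p_trans(s, t) * p_emit(t, words[i]), s)
                    for s in tags
                )
                column[t] = (prob, back)
            best.append(column)
        # Follow backpointers from the most probable final tag.
        tag = max(best[-1], key=lambda t: best[-1][t][0])
        path = [tag]
        for i in range(len(words) - 1, 0, -1):
            tag = best[i][tag][1]
            path.append(tag)
        return list(reversed(path))

    print(viterbi(["promised", "to", "back", "the", "bill"]))
    # -> ['VBN', 'TO', 'VB', 'DT', 'NN']: "back" comes out VB here,
    #    even though it was also seen as NN and RB in the corpus,
    #    because the transition prior P(VB | TO) outweighs the alternatives.

Note the design point: neither probability table alone resolves "back" (its emission counts favor RB), but the product of likelihood and prior over the whole sequence does.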
