Oxford Machine Learning Lecture Slides 10 (PDF)

Published 2018-11-26
Max-margin learning
Nando de Freitas

Outline of the lecture

Max-margin learning is an extremely powerful idea: learn features with auxiliary tasks, then use these features to solve tasks with little data. The goal of this lecture is for you to learn:

- Transfer, multi-task, and multi-instance learning
- Harnessing auxiliary tasks to learn features: matchings, preferences, corruption
- Formulations of multi-task learning
- Applications: cross-lingual embeddings, relation learning, question answering, memory networks
- Embedding discrete objects in metric spaces: code, words, formulae, logical expressions, symbols, DNA sequences

Idea: learn embeddings (features) in one task and transfer these to solve new tasks [Kotzias, Denil, de Freitas, 2014]

Deep Multi-Instance Learning

Auxiliary tasks to learn features that can be transferred to learn tasks with few labels:
1. Matchings [From machine learning to machine reasoning, Léon Bottou]
2. Corruption [NLP (almost) from scratch, Ronan Collobert and colleagues]
3. Preferences

Max-margin formulations [A tutorial on energy-based learning, Yann LeCun et al.]

Hinge loss, unconstrained formulation

Hinge loss layer

Example: Bilingual word embeddings [Karl Hermann and Phil Blunsom, 2014]

Siamese networks (Yann LeCun)

Semi-supervised deep learning (Jason Weston et al.)

Applications
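The corruption auxiliary task and the hinge loss combine into a pairwise ranking objective: score a genuine example higher than a corrupted one by at least a fixed margin. A minimal NumPy sketch of that unconstrained hinge formulation follows; the function name and the toy scores are illustrative, not from the slides.

```python
import numpy as np

def hinge_ranking_loss(score_real, score_corrupt, margin=1.0):
    """Pairwise ranking hinge loss: mean of max(0, margin - s_real + s_corrupt).

    Zero loss whenever the real example out-scores its corrupted
    counterpart by at least `margin`; linear penalty otherwise.
    """
    return np.maximum(0.0, margin - score_real + score_corrupt).mean()

# Toy batch of three (real, corrupted) score pairs.
scores_real = np.array([2.0, 0.5, 1.5])
scores_corrupt = np.array([0.0, 1.0, 0.2])
print(hinge_ranking_loss(scores_real, scores_corrupt))  # → 0.5
```

Only the middle pair violates the margin (0.5 vs. 1.0, hinge term 1.5), so the batch mean is 1.5 / 3 = 0.5. Because the loss is zero on well-separated pairs, training focuses on the pairs that are still ordered wrongly or too close.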

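The Siamese-network idea (Chopra, Hadsell and LeCun) fits the same max-margin template: two inputs pass through the same embedding function, and a contrastive loss acts on the distance between their embeddings. The sketch below uses a toy shared linear embedding; the weight matrix and inputs are illustrative assumptions, not values from the lecture.

```python
import numpy as np

W = np.array([[1.0, 0.0], [0.0, 1.0]])  # shared embedding weights (toy)

def embed(x):
    # Both branches use the same W: weight sharing is what makes it "siamese".
    return W @ x

def contrastive_loss(x1, x2, same, margin=1.0):
    d = np.linalg.norm(embed(x1) - embed(x2))
    # same=1: pull the pair together (d^2);
    # same=0: push the pair apart, but only up to the margin.
    return same * d**2 + (1 - same) * max(0.0, margin - d)**2

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(contrastive_loss(a, a, same=1))  # identical similar pair: loss 0.0
print(contrastive_loss(a, b, same=1))  # similar pair far apart: penalized
```

Note the max-margin character on the dissimilar side: once a negative pair is farther apart than the margin, it contributes zero loss, exactly as in the hinge formulations above.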