Concluding Remarks

LtDAHP provides an efficient way of overcoming both the high computational burden of OSL (one-stage learning) and the uncertainty difficulty of LtRAHP. LtDAHP establishes a new paradigm in which supervised learning problems can be solved very simply, yet still effectively, by preassigning the hidden parameters and solving only for the bright parameters, without sacrificing generalization capability. Many problems on LtDAHP remain open and deserve further study.

Thank You!

Some Theoretical Issues in Machine Learning

Outline
The universality theory of linear learning machines
Regularization theory based on error modeling
New models and new theories for sparse information processing

A New Learning Paradigm: LtDAHP (Learning through Deterministic Assignment of Hidden Parameters)
Zongben Xu (Xi'an Jiaotong University, Xi'an, China)
Email: zbxu@
Homepage:

A supervised learning problem: difficult or easy? Can a difficult learning problem be solved more simply? Is a linear machine universal?

Outline
Some Related Concepts
LtRAHP: Learning through Random Assignment of Hidden Parameters
LtDAHP: Learning through Deterministic Assignment of Hidden Parameters
Concluding Remarks

Some Related Concepts: Supervised Learning
Supervised learning: given a finite number of input/output samples, find a function f in a machine H that approximates the unknown relation between the input and output spaces. The learned relation is treated as a black box; typical applications include face recognition, social networks, and stock index tracking.

Some Related Concepts: HP vs BP
ERM machines; FNNs.
Hidden parameters (HPs): determine the hidden predictors (the nonlinear mechanism).
Bright parameters (BPs): determine how the hidden predictors are linearly combined (the linear mechanism). A standard formula making this split explicit, and a small sketch of the resulting two-stage scheme, are given below.

One-Stage Learning: HPs and BPs are trained simultaneously in one stage.
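The HP/BP split can be made concrete by writing the machine as a single-hidden-layer FNN; this is the standard form for such models and is stated here as an assumption, since the slides name FNNs but the formula did not survive in this extract:

$$ f(x) = \sum_{i=1}^{n} a_i \, \sigma(\langle w_i, x \rangle + b_i) $$

Here the hidden parameters are the inner weights and biases (w_i, b_i), which enter nonlinearly through the activation σ, while the bright parameters are the outer coefficients a_i, which enter linearly. One-stage learning trains (w_i, b_i) and a_i jointly; LtRAHP and LtDAHP instead fix (w_i, b_i) in advance (randomly or deterministically, respectively) and fit only the a_i.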
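Below is a minimal sketch of the resulting two-stage scheme, assuming Gaussian hidden units on a one-dimensional toy problem; the equally spaced grid of centers, the width value, the toy data, and all function names are illustrative stand-ins, not the deterministic assignment scheme actually proposed in the talk.

```python
import numpy as np

def hidden_features(x, centers, width):
    # Gaussian radial units with *fixed* hidden parameters (centers, width);
    # returns an (n_samples, n_hidden) matrix of hidden predictors.
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

def fit_bright_parameters(x, y, centers, width):
    # Second stage: only the linear (bright) parameters are solved,
    # here by ordinary least squares.
    H = hidden_features(x, centers, width)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return beta

def predict(x, centers, width, beta):
    return hidden_features(x, centers, width) @ beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0.0, 1.0, size=200))
    y = np.sin(2.0 * np.pi * x) + 0.1 * rng.standard_normal(x.shape)

    # First stage, LtDAHP-style: a deterministic assignment of the hidden
    # parameters (an equally spaced grid of centers stands in for whatever
    # deterministic scheme the talk proposes).
    centers = np.linspace(0.0, 1.0, 20)
    # An LtRAHP-style variant would draw them at random instead, e.g.:
    # centers = rng.uniform(0.0, 1.0, size=20)
    width = 0.1

    beta = fit_bright_parameters(x, y, centers, width)
    y_hat = predict(x, centers, width, beta)
    print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

Because the bright parameters enter linearly, the second stage reduces to an ordinary least-squares problem, which is what makes such two-stage schemes far cheaper than training hidden and bright parameters jointly in one stage.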