Description
This course extends the fundamental tools of “Machine Learning Foundations” into powerful and practical models along three directions: embedding numerous features, combining predictive features, and distilling hidden features.
What you will learn
Lecture 1: Linear Support Vector Machine
more robust, large-margin linear classification, solvable with quadratic programming
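For reference, the large-margin problem behind this lecture is usually written as the following quadratic program (a sketch in standard notation, where the $\mathbf{x}_n$ are inputs, $y_n \in \{-1, +1\}$ are labels, and $(b, \mathbf{w})$ parameterize the separating hyperplane; the lecture's own notation may differ slightly):

$$\min_{b,\mathbf{w}} \ \frac{1}{2}\mathbf{w}^{\mathsf{T}}\mathbf{w} \qquad \text{subject to} \qquad y_n\bigl(\mathbf{w}^{\mathsf{T}}\mathbf{x}_n + b\bigr) \ge 1, \quad n = 1, \dots, N,$$

a QP with $d + 1$ variables ($d$ being the input dimension) and $N$ linear constraints.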
Lecture 2: Dual Support Vector Machine
another QP form of SVM that carries valuable geometric insights and depends only minimally on the dimension of the feature transform
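As a sketch of that second QP form (the standard Lagrange dual of the problem above, written with transformed inputs $\mathbf{z}_n = \boldsymbol{\Phi}(\mathbf{x}_n)$; the notation is assumed, not quoted from the slides):

$$\min_{\boldsymbol{\alpha}} \ \frac{1}{2}\sum_{n=1}^{N}\sum_{m=1}^{N}\alpha_n \alpha_m y_n y_m \,\mathbf{z}_n^{\mathsf{T}}\mathbf{z}_m \;-\; \sum_{n=1}^{N}\alpha_n \qquad \text{subject to} \qquad \sum_{n=1}^{N} y_n \alpha_n = 0, \quad \alpha_n \ge 0,$$

a QP with $N$ variables and $N + 1$ constraints; the transform dimension appears only inside the inner products $\mathbf{z}_n^{\mathsf{T}}\mathbf{z}_m$, and the nonzero $\alpha_n$ mark the support vectors on the margin boundary.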
Lecture 3: Kernel Support Vector Machine
the kernel as a shortcut for (transform + inner product), enabling a spectrum of models with margin control, from simple linear ones to infinite-dimensional ones
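One standard illustration of the shortcut (an assumed example, not necessarily the one highlighted in the lecture): a kernel computes $K(\mathbf{x}, \mathbf{x}') = \boldsymbol{\Phi}(\mathbf{x})^{\mathsf{T}}\boldsymbol{\Phi}(\mathbf{x}')$ directly from the original inputs, e.g. the Gaussian kernel

$$K(\mathbf{x}, \mathbf{x}') = \exp\bigl(-\gamma \,\lVert \mathbf{x} - \mathbf{x}' \rVert^{2}\bigr), \qquad \gamma > 0,$$

which corresponds to an infinite-dimensional transform, so the dual QP and the final classifier $g(\mathbf{x}) = \operatorname{sign}\bigl(\sum_{\text{SV}} \alpha_n y_n K(\mathbf{x}_n, \mathbf{x}) + b\bigr)$ never touch the transformed vectors explicitly.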
Lecture 4: Soft-Margin Support Vector Machine
a new primal formulation that allows some penalized margin violations, which is equivalent to a dual formulation with upper-bounded variables
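A sketch of that primal formulation in standard notation (slack variables $\xi_n$ measure the margin violations and $C > 0$ sets their penalty; the lecture's symbols may differ):

$$\min_{b,\mathbf{w},\boldsymbol{\xi}} \ \frac{1}{2}\mathbf{w}^{\mathsf{T}}\mathbf{w} + C\sum_{n=1}^{N}\xi_n \qquad \text{subject to} \qquad y_n\bigl(\mathbf{w}^{\mathsf{T}}\mathbf{z}_n + b\bigr) \ge 1 - \xi_n, \quad \xi_n \ge 0,$$

whose dual is the same QP as in Lecture 2 except that each dual variable is now upper-bounded: $0 \le \alpha_n \le C$.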