|Title||Latent Structured Perceptrons for Large-Scale Learning with Hidden Information|
|Affiliation||Peking University, Key Laboratory of Computational Linguistics, Ministry of Education, Beijing 100871, China.|
Peking University, School of EECS, Beijing 100871, China.
National Institute of Informatics, Chiyoda-ku, Tokyo 101-8430, Japan.
The Hong Kong Polytechnic University, Department of Computing, Kowloon 999077, Hong Kong, China.
|Citation||IEEE Transactions on Knowledge and Data Engineering, 2013, 25(9): 2063-2075.|
|Abstract||Many real-world data mining problems contain hidden information (e.g., unobservable latent dependencies). We propose a perceptron-style method, the latent structured perceptron, for fast discriminative learning of structured classification with hidden information. We also give a theoretical analysis and demonstrate good convergence properties of the proposed method. Our method extends the perceptron algorithm to learning tasks with hidden information, which can hardly be captured by traditional models. It relies on Viterbi decoding over latent variables, combined with simple additive updates. We perform experiments on one synthetic data set and two real-world structured classification tasks. Compared to conventional nonlatent models (e.g., conditional random fields, structured perceptrons), our method is more accurate on real-world tasks. Compared to existing heavyweight probabilistic models of latent variables (e.g., latent conditional random fields), our method lowers the training cost significantly (almost an order of magnitude faster) yet achieves comparable or even superior classification accuracy. In addition, experiments demonstrate that the proposed method scales well to large problems.|
|Appears in Collections:||Key Laboratory of Computational Linguistics, Ministry of Education|
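The abstract describes the core training loop: decode the best (label, latent) assignment, find the best latent completion of the gold labels, and apply a simple additive perceptron update on the feature difference. The sketch below is a minimal toy illustration of that idea, not the paper's implementation: the feature map, label/latent sets, and brute-force enumeration (standing in for true Viterbi dynamic programming) are all assumptions chosen for brevity.

```python
# Toy latent structured perceptron sketch (NOT the paper's code).
# Brute-force enumeration replaces Viterbi DP; fine only for tiny sequences.
import itertools

def features(x, y, h):
    # Hypothetical feature map: (token, label, latent) counts + label bigrams.
    f = {}
    for i, tok in enumerate(x):
        f[(tok, y[i], h[i])] = f.get((tok, y[i], h[i]), 0) + 1
        if i > 0:
            f[('bi', y[i - 1], y[i])] = f.get(('bi', y[i - 1], y[i]), 0) + 1
    return f

def score(w, f):
    return sum(w.get(k, 0.0) * v for k, v in f.items())

def decode(w, x, labels, latents, fix_y=None):
    # Joint argmax over (y, h); with fix_y set, argmax over latents only.
    best, best_yh = float('-inf'), None
    ys = [fix_y] if fix_y else itertools.product(labels, repeat=len(x))
    for y in ys:
        for h in itertools.product(latents, repeat=len(x)):
            s = score(w, features(x, list(y), list(h)))
            if s > best:
                best, best_yh = s, (list(y), list(h))
    return best_yh

def train(data, labels, latents, epochs=5):
    w = {}
    for _ in range(epochs):
        for x, y_gold in data:
            y_hat, h_hat = decode(w, x, labels, latents)
            # Best latent completion of the gold label sequence.
            _, h_gold = decode(w, x, labels, latents, fix_y=y_gold)
            if y_hat != y_gold:
                # Additive update: reward gold features, penalize predicted ones.
                for k, v in features(x, y_gold, h_gold).items():
                    w[k] = w.get(k, 0.0) + v
                for k, v in features(x, y_hat, h_hat).items():
                    w[k] = w.get(k, 0.0) - v
    return w
```

On real tasks the joint argmax would be computed by Viterbi over an expanded state space (labels × latent values), which is what keeps training cost low relative to latent CRFs that must marginalize over latent variables.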