Title: Collaborative Group Learning
Authors: Feng, Shaoxiong
Chen, Hongshen
Ren, Xuancheng
Ding, Zhuoye
Li, Kan
Sun, Xu
Affiliation: Beijing Inst Technol, Sch Comp Sci & Technol, Beijing, Peoples R China
JD.com, Beijing, Peoples R China
Peking Univ, Sch EECS, MOE Key Lab Computat Linguist, Beijing, Peoples R China
Peking Univ, Ctr Data Sci, Beijing, Peoples R China
Issue Date: 2021
Publisher: THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE
Abstract: Collaborative learning has successfully applied knowledge transfer to guide a pool of small student networks towards robust local minima. However, previous approaches typically struggle with drastically aggravated student homogenization as the number of students rises. In this paper, we propose Collaborative Group Learning, an efficient framework that aims to diversify the feature representation and provide effective regularization. Intuitively, similar to the human group study mechanism, we induce students to learn and exchange different parts of course knowledge as collaborative groups. First, each student is established by random routing on a modular neural network, which facilitates flexible knowledge communication between students through random levels of representation sharing and branching. Second, to resist student homogenization, students first compose diverse feature sets by exploiting the inductive bias from subsets of the training data, and then aggregate and distill complementary knowledge by imitating a random subgroup of students at each time step. Overall, these mechanisms make it possible to enlarge the student population, further improving model generalization without sacrificing computational efficiency. Empirical evaluations on both image and text tasks indicate that our method significantly outperforms various state-of-the-art collaborative approaches whilst enhancing computational efficiency.
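The subgroup imitation step sketched in the abstract can be illustrated roughly as follows. This is only an interpretive sketch based on the abstract, not the authors' code: the function names, the temperature, and the choice to average the subgroup's soft targets before computing a KL divergence are all assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def subgroup_distill_loss(student_logits, i, group_size, rng, temperature=2.0):
    """Distillation loss for student i against a random subgroup of peers.

    A random subgroup of the other students is sampled; their softened
    predictions are averaged into one target distribution, and student i
    is penalized by KL(target || student_i). In training, the target
    would be treated as a constant (no gradient through the peers).
    """
    n = len(student_logits)
    peers = [j for j in range(n) if j != i]
    group = rng.choice(peers, size=group_size, replace=False)
    target = np.mean([softmax(student_logits[j], temperature) for j in group], axis=0)
    p = softmax(student_logits[i], temperature)
    eps = 1e-12  # numerical floor to keep the logs finite
    return float(np.sum(target * (np.log(target + eps) - np.log(p + eps))))
```

In the framework described, each student would imitate a freshly sampled subgroup at every time step, so over training every student is exposed to many different combinations of peer knowledge rather than converging toward a single shared teacher signal.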
URI: http://hdl.handle.net/20.500.11897/623142
ISBN: 978-1-57735-866-4
ISSN: 2159-5399
Indexed: CPCI-S (ISTP)
Appears in Collections: School of Electronics Engineering and Computer Science
MOE Key Laboratory of Computational Linguistics
Other Research Institutes

Files in This Work
There are no files associated with this item.

License: See PKU IR operational policies.