ReFactor GNN: Revisiting FMs from a Message-Passing Perspective

Speaker: 陈艺虹 (Yihong Chen) | Joint PhD student at UCL and FAIR

  • Start time

    2022.11.16 20:00

  • Duration

    47 minutes

  • Learners

    3,522 views



Factorisation-based Models (FMs), such as DistMult, have enjoyed enduring success for Knowledge Graph Completion (KGC) tasks, often outperforming Graph Neural Networks (GNNs). However, unlike GNNs, FMs struggle to incorporate node features and generalise to unseen nodes in inductive settings. Our work bridges the gap between FMs and GNNs by proposing ReFactor GNNs. This new architecture draws upon both modelling paradigms, which previously were largely thought of as disjoint. Concretely, using a message-passing formalism, we show how FMs can be cast as GNNs by reformulating the gradient descent procedure as message-passing operations, which forms the basis of our ReFactor GNNs. Across a multitude of well-established KGC benchmarks, our ReFactor GNNs achieve comparable transductive performance to FMs, and state-of-the-art inductive performance while using an order of magnitude fewer parameters. 
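The key observation above — that gradient descent on a factorisation model's node embeddings can be read as a message-passing step — can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a DistMult-style score `sum_i e[h,i] * w[r,i] * e[t,i]`, a toy graph, and illustrative variable names, and simply checks that the gradient-ascent update on a node's embedding equals a sum of per-edge "messages" from its neighbours:

```python
import numpy as np

# Hypothetical toy setup: DistMult scores a triple (h, r, t) as
# sum_i e[h, i] * w[r, i] * e[t, i].
rng = np.random.default_rng(0)
d = 4                        # embedding dimension
e = rng.normal(size=(3, d))  # entity embeddings for nodes 0, 1, 2
w = rng.normal(size=(2, d))  # relation embeddings for relations 0, 1

# Edges incident to node 0, as (head, relation, tail) triples.
edges = [(0, 0, 1), (0, 1, 2)]
lr = 0.1                     # step size

# Gradient-descent view: one gradient-ascent step on the summed
# scores of node 0's outgoing edges, taken w.r.t. e[0].
# d/d e[0] of sum_i e[0,i]*w[r,i]*e[t,i] is the vector w[r] * e[t].
grad = sum(w[r] * e[t] for (_, r, t) in edges)
e0_gd = e[0] + lr * grad

# Message-passing view: each neighbour t sends the message
# w[r] * e[t] along its edge; node 0 aggregates by summation
# and applies the same update.
messages = [w[r] * e[t] for (_, r, t) in edges]
e0_mp = e[0] + lr * np.sum(messages, axis=0)

# The two views produce the identical updated embedding.
assert np.allclose(e0_gd, e0_mp)
```

The identity is deliberately trivial here: the gradient of the score with respect to a node's embedding decomposes edge-by-edge, which is exactly what lets an FM's training dynamics be rewritten as GNN-style message passing.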
