New journal publication: 'BioBLP: a modular framework for learning on multimodal biomedical knowledge graphs'
After almost a year of extensive research, BioBLP is published! We explore the combination of multimodal pretrained attribute encoders with Knowledge Graph Embeddings for biomedical Link Prediction!
Congrats to Daniel Daza, Dimitrios Alivanistos, Thom Pijnenburg, Payal Mitra, Michael Cochez and Paul Groth!
Abstract:
BioBLP makes it possible to investigate different ways of incorporating multimodal biomedical data when learning representations in KGs. Using a particular implementation, we find that incorporating attribute data does not consistently outperform baselines, but we do observe improvements on a comparatively large subset of entities below a specific node degree. Our results indicate potential for improved performance in scientific discovery tasks, where understudied areas of the KG would benefit from link prediction methods.
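To give a rough idea of the modular setup described above, here is a minimal sketch of how an attribute encoder can replace an entity embedding lookup table when scoring triples for link prediction. This is not the authors' implementation: the `AttributeEncoder` and `DistMultScorer` classes are hypothetical stand-ins, and BioBLP itself combines pretrained, modality-specific encoders with several KGE scoring functions.

```python
# Minimal sketch (assumption, not the BioBLP codebase): entity embeddings are
# produced by an attribute encoder instead of being looked up in a table,
# then scored with a standard KGE scoring function (DistMult here).
import torch
import torch.nn as nn


class AttributeEncoder(nn.Module):
    """Hypothetical encoder mapping raw attribute vectors to entity embeddings."""

    def __init__(self, attr_dim: int, embed_dim: int):
        super().__init__()
        self.proj = nn.Linear(attr_dim, embed_dim)

    def forward(self, attrs: torch.Tensor) -> torch.Tensor:
        return self.proj(attrs)


class DistMultScorer(nn.Module):
    """DistMult scoring: score(h, r, t) = sum(e_h * w_r * e_t)."""

    def __init__(self, num_relations: int, embed_dim: int):
        super().__init__()
        self.rel = nn.Embedding(num_relations, embed_dim)

    def forward(self, e_head, rel_idx, e_tail):
        return (e_head * self.rel(rel_idx) * e_tail).sum(dim=-1)


# Toy usage: 32-dimensional attribute data, 64-dimensional embeddings, 5 relations.
encoder = AttributeEncoder(attr_dim=32, embed_dim=64)
scorer = DistMultScorer(num_relations=5, embed_dim=64)

head_attrs = torch.randn(8, 32)          # attributes of 8 head entities
tail_attrs = torch.randn(8, 32)          # attributes of 8 tail entities
relations = torch.randint(0, 5, (8,))    # relation index per triple

scores = scorer(encoder(head_attrs), relations, encoder(tail_attrs))
print(scores.shape)  # torch.Size([8])
```

In this modular view, swapping the encoder (e.g. a pretrained text or molecule encoder) or the scoring function changes how attribute information enters the learned representations, which is the kind of comparison the paper investigates.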