A multi-level fusion based decision support system for academic collaborator recommendation

Tribikram Pradhan, Sukomal Pal

Research output: Contribution to journal › Article › peer-review

23 Citations (Scopus)


In academia, researchers collaborate with their peers to improve the quality of research and thereby enhance their academic profiles. However, information overload in big scholarly data poses a challenge in identifying potential researchers for fruitful collaboration. In this article, we introduce a multi-level fusion-based model for collaborator recommendation, DRACoR (Deep learning and Random walk based Academic Collaborator Recommender). DRACoR fuses a deep learning model with a biased random walk to recommend potential collaborators who share similar research interests at the peer level. We run a topic model on abstracts and Doc2Vec on titles of year-wise publications to capture researchers' dynamic research interests. Author–author cosine similarity is computed from the feature vectors extracted from abstracts and titles and is then used to weight the edges of the author–author graph (AAG). We also aggregate various meta-path features with profile-aware features to bias the random walk behavior. Finally, we employ a random walk with restart (RWR) over this weighted graph, using the edge weights to bias the walker, and recommend the top-N collaborators. Extensive experiments on the DBLP and hep-th datasets demonstrate the effectiveness of the proposed DRACoR model against various state-of-the-art methods in terms of precision, recall, F1-score, MRR, and nDCG.

Original language: English
Article number: 105784
Journal: Knowledge-Based Systems
Publication status: Published - 07-06-2020

All Science Journal Classification (ASJC) codes

  • Management Information Systems
  • Software
  • Information Systems and Management
  • Artificial Intelligence

