
and MRR values compared to UNL, particularly at higher values of K. RWN's ability to maintain superior MAP and MRR scores highlights its capacity to handle sparse and incomplete datasets effectively.
For rm = 60%, RWN continued to outperform the other normalization schemes, achieving an MF of 0.298 at K = 5 and an MAP of 0.292 at K = 20. SNL achieved an MF of 0.264 and an MAP of 0.281 at K = 20, showing its limitations in sparse scenarios. DSN and UNL exhibited declining performance as sparsity increased, further underscoring the robustness of RWN in high-sparsity environments.
The loss curves presented in Figure 2 further validate these findings. The Collaborative Filtering (CF) and Knowledge Graph (KG) loss curves indicate faster convergence and stability for RWN compared to SNL. Specifically, RWN achieved lower overall CF and KG losses, reflecting its ability to learn more effectively. Figure 3 compares the runtime efficiency of SNL and RWN, showing that while RWN required slightly more computation time, its superior performance justifies the trade-off.
Overall, the results emphasize the superiority of the Random Walk Normalized Laplacian (RWN) over the Symmetric Normalized Laplacian (SNL) for the library recommendation task. Although DSN and UNL provided additional points of comparison, they were included purely as baselines. The ability of RWN to capture higher-order interactions and deliver superior ranking performance makes it the preferred choice, particularly in sparse settings. These findings validate the design choices of the applied framework and highlight the critical role of normalization schemes in enhancing recommendation quality and retrieval performance.
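As a point of reference, the normalization schemes compared above differ only in how the adjacency matrix is scaled by the node degrees. The sketch below is illustrative: the function name and toy graph are not from the paper, and the DSN variant uses generic Sinkhorn-style balancing, which is only one way such a scheme may be realized:

```python
import numpy as np

def normalized_adjacency(A, scheme):
    """Degree-based normalizations of an adjacency matrix A.

    'snl': symmetric,   D^{-1/2} A D^{-1/2}
    'rwn': random walk, D^{-1} A  (rows sum to 1)
    'unl': unnormalized adjacency (the Laplacian would be D - A)
    'dsn': Sinkhorn-style balancing toward a doubly stochastic matrix
    """
    d = A.sum(axis=1)
    d_safe = np.where(d > 0, d, 1.0)  # guard against isolated nodes
    if scheme == "snl":
        inv_sqrt = 1.0 / np.sqrt(d_safe)
        return inv_sqrt[:, None] * A * inv_sqrt[None, :]
    if scheme == "rwn":
        return A / d_safe[:, None]
    if scheme == "unl":
        return A
    if scheme == "dsn":
        # Alternating row/column normalization; convergence to a truly
        # doubly stochastic matrix requires suitable support in A.
        M = A.astype(float).copy()
        for _ in range(50):
            M /= np.maximum(M.sum(axis=1, keepdims=True), 1e-12)
            M /= np.maximum(M.sum(axis=0, keepdims=True), 1e-12)
        return M
    raise ValueError(f"unknown scheme: {scheme}")

# Toy 3-node interaction graph
A = np.array([[0., 1., 1.],
              [1., 0., 0.],
              [1., 0., 0.]])
P_rwn = normalized_adjacency(A, "rwn")  # row-stochastic propagation matrix
```

The random-walk variant yields a row-stochastic matrix, which offers one intuition for why it can remain well behaved when node degrees are highly skewed, as in sparse interaction graphs.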
Figure 3: Runtime Efficiency Comparison between SNL and RWN.
5 CONCLUSIONS
This study evaluated the impact of the Symmetric Normalized Laplacian (SNL) and the Random Walk Normalized Laplacian (RWN) on the performance of the PyRec recommendation model on datasets with varying levels of missing library information (rm = 20%, 40%, 60%). The results show that RWN consistently outperforms SNL across all sparsity levels and evaluation metrics, particularly in sparse scenarios (rm = 60%), highlighting its robustness and ability to improve ranking quality. Although SNL performs reasonably well on dense datasets (rm = 20%), its effectiveness diminishes with increasing sparsity.
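The rm settings evaluated here correspond to hiding a fraction of the known project-library links before training. A hedged sketch of such masking follows; the function name and the dense toy matrix are illustrative, not the paper's experimental pipeline:

```python
import numpy as np

def mask_interactions(A, rm, seed=0):
    """Randomly hide a fraction rm of the positive entries of a
    project-library interaction matrix, simulating missing data."""
    rng = np.random.default_rng(seed)
    rows, cols = np.nonzero(A)
    n_hide = int(round(rm * len(rows)))
    idx = rng.choice(len(rows), size=n_hide, replace=False)
    A_sparse = A.copy()
    A_sparse[rows[idx], cols[idx]] = 0
    return A_sparse

# Toy dense matrix: hiding 60% of links mirrors the rm = 60% setting
A = np.ones((10, 10))
A60 = mask_interactions(A, rm=0.60)
```

Re-running training at each masking level then allows metrics such as MAP@K and MRR@K to be compared across sparsity regimes.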
Future work will examine Doubly Stochastic Normalization and the Unnormalized Laplacian in greater depth, along with other normalization techniques, to further improve the adaptability of PyRec. Investigating the interplay between normalization schemes, hyperparameter configurations, and temporal dynamics will also be prioritized to improve the scalability and generalizability of the model.
These findings emphasize the critical role of normalization techniques in improving the performance of graph-based recommendation systems and open avenues for further advancements in handling sparse and complex datasets.
ENASE 2025 - 20th International Conference on Evaluation of Novel Approaches to Software Engineering