TY - GEN
T1 - Explainable Deep Learning for PCOS Detection in Ultrasound Images
T2 - 3rd IEEE International Conference on Networks, Multimedia and Information Technology, NMITCON 2025
AU - Bhatnagar, Shaleen
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Polycystic Ovary Syndrome (PCOS) is one of the most common endocrine disorders among women of reproductive age, and early, accurate diagnosis is essential for targeted treatment. In this paper, we propose an interpretable deep learning model for PCOS diagnosis based on ovarian ultrasound images. The model is derived from the MobileNetV2 architecture and was trained and validated on an open-access dataset of 3,996 images (2,036 PCOS and 1,960 normal). The trained model attained a validation accuracy of 84%, a class-wise F1-score of 0.79 for PCOS, and a weighted F1-score of 0.67, demonstrating suitable performance in detecting PCOS cases. To improve interpretability and clinician confidence, we used Gradient-weighted Class Activation Mapping (Grad-CAM) to generate visual explanations that highlight the image regions most influential in the model's predictions. These visualizations help clinicians interpret and validate automated diagnostic output, allowing artificial intelligence to be integrated into medical practice. In conclusion, the findings indicate the translational value of explainable AI systems for supporting computer-aided diagnosis and second opinions in gynecology, to the benefit of clinical decision making and patient care.
AB - Polycystic Ovary Syndrome (PCOS) is one of the most common endocrine disorders among women of reproductive age, and early, accurate diagnosis is essential for targeted treatment. In this paper, we propose an interpretable deep learning model for PCOS diagnosis based on ovarian ultrasound images. The model is derived from the MobileNetV2 architecture and was trained and validated on an open-access dataset of 3,996 images (2,036 PCOS and 1,960 normal). The trained model attained a validation accuracy of 84%, a class-wise F1-score of 0.79 for PCOS, and a weighted F1-score of 0.67, demonstrating suitable performance in detecting PCOS cases. To improve interpretability and clinician confidence, we used Gradient-weighted Class Activation Mapping (Grad-CAM) to generate visual explanations that highlight the image regions most influential in the model's predictions. These visualizations help clinicians interpret and validate automated diagnostic output, allowing artificial intelligence to be integrated into medical practice. In conclusion, the findings indicate the translational value of explainable AI systems for supporting computer-aided diagnosis and second opinions in gynecology, to the benefit of clinical decision making and patient care.
UR - https://www.scopus.com/pages/publications/105020828459
U2 - 10.1109/NMITCON65824.2025.11188205
DO - 10.1109/NMITCON65824.2025.11188205
M3 - Conference contribution
AN - SCOPUS:105020828459
T3 - 3rd IEEE International Conference on Networks, Multimedia and Information Technology, NMITCON 2025
BT - 3rd IEEE International Conference on Networks, Multimedia and Information Technology, NMITCON 2025
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 1 August 2025 through 2 August 2025
ER -