Explainable Deep Learning Models for Early Diagnosis of Cardiovascular Diseases Using Multi-modal Patient Data

Authors

  • Sujit Murumkar, Director, Data & AI Practice, Axtria Inc.

DOI:

https://doi.org/10.63282/3050-9262.IJAIDSML-V5I4P119

Keywords:

Explainable Artificial Intelligence (XAI), Deep Learning, Convolutional Neural Networks (CNN), Cardiovascular Disease Diagnosis, Multi-modal Patient Data, Medical Decision Support Systems, Healthcare Analytics, Clinical Explainability, Risk Factor Analysis, Early Disease Detection

Abstract

With the rapid advancement of digital technologies, deep learning and artificial intelligence have begun to demonstrate their effectiveness across public sectors, including healthcare. Although these models are increasingly integrated into patient care, their adoption is still hindered by the black-box nature of their decision-making. Several deep learning approaches have been described for the early detection of cardiovascular disease (CVD). Key factors such as patient privacy and data security must be addressed before any diagnostic approach is deployed, so that clinical trust is established and healthcare systems can rely on automated algorithms. This study uses a dataset of 50 records to examine the rise of CVD through gender segmentation. A CNN model shows that explainable deep learning can diagnose cardiovascular disease in patients with high accuracy and assist healthcare professionals in choosing better treatment options. ANOVA, descriptive, and correlation tests indicate that the data follow an approximately normal distribution, and that smoking, lack of physical activity, and limited education successively contribute to the rise of CVD.
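
The sketch below illustrates the kind of descriptive, ANOVA, and correlation analysis described in the abstract on a small tabular CVD dataset. It is a minimal example under stated assumptions: the column names (gender, smoking, physical_activity, education_years, cvd_risk) and the synthetic records are placeholders for demonstration, not the study's actual 50-entry dataset or code.

# Minimal, illustrative sketch (not the study's code): descriptive statistics,
# one-way ANOVA by gender, and correlation analysis on a synthetic 50-record table.
# Column names are assumptions made for demonstration only.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
n = 50  # the study reports a dataset of 50 entries

# Synthetic stand-in data; the real patient records are not available here.
df = pd.DataFrame({
    "gender": rng.choice(["male", "female"], size=n),
    "smoking": rng.integers(0, 2, size=n),            # 1 = smoker
    "physical_activity": rng.integers(0, 2, size=n),  # 1 = physically active
    "education_years": rng.integers(6, 18, size=n),
    "cvd_risk": rng.normal(0.5, 0.15, size=n).clip(0, 1),
})

# Descriptive statistics for all columns
print(df.describe(include="all"))

# One-way ANOVA: does mean CVD risk differ between gender groups?
groups = [g["cvd_risk"].to_numpy() for _, g in df.groupby("gender")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"ANOVA by gender: F = {f_stat:.3f}, p = {p_val:.3f}")

# Pearson correlations between risk factors and CVD risk
print(df[["smoking", "physical_activity", "education_years", "cvd_risk"]].corr())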

Published

2024-12-30

Issue

Section

Articles

How to Cite

Murumkar S. Explainable Deep Learning Models for Early Diagnosis of Cardiovascular Diseases Using Multi-modal Patient Data. IJAIDSML [Internet]. 2024 Dec. 30 [cited 2026 Jan. 23];5(4):206-13. Available from: https://ijaidsml.org/index.php/ijaidsml/article/view/389