A Novel Blockchain Proof of Validation Scheme Based on Explainable AI for Healthcare Workload

DOI: https://doi.org/10.21015/vtcs.v13i1.2122

Abstract

The use of blockchain with machine learning to optimise data validation in terms of transparency, validity, and immutability has been increasing steadily. Many complex applications, such as healthcare and related disease-management processes, now require many remote resources to be deployed in a transparent form. Blockchain provides real-time security validation based on proof-of-work validation schemes. To handle the dynamic behaviour of the blockchain, machine learning is widely employed to support decisions and improve the efficiency of security. However, combining blockchain technology with machine learning still has many limitations. To address this issue, a novel blockchain proof of validation scheme based on explainable AI is needed for healthcare applications, so that blockchain decisions made with machine learning can be processed in a more explainable way. We present the blockchain proof of work validation with explainable AI (PoWV-XAI) scheme to control dynamic delay, energy, cost, and security issues in comparison with existing blockchains that use machine learning algorithms. The proposed PoWV-XAI algorithm incorporates different metaheuristic schemes and supports the explainability of healthcare workload execution on different nodes, such as local and server nodes. Simulation results show that the proposed PoWV-XAI is more explainable: all of its decisions, covering processing delay, validation, security, energy, and cost, are explainable compared to existing blockchain methods.
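The abstract's core idea, validating a healthcare workload block with proof of work while keeping the decision inspectable, can be illustrated with a minimal sketch. This is not the paper's PoWV-XAI algorithm; the difficulty target, field names, and the "evidence report" returned by `validate` are illustrative assumptions only, chosen to show how a validation decision can carry its own explanation (hash, target, and measured delay) rather than a bare accept/reject flag.

```python
import hashlib
import json
import time

DIFFICULTY = 4  # required leading hex zeros (illustrative, not from the paper)

def block_hash(block: dict) -> str:
    # Deterministic hash over the block's canonical JSON encoding.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def mine(block: dict) -> dict:
    # Search for a nonce whose hash meets the difficulty target.
    block = dict(block, nonce=0)
    while not block_hash(block).startswith("0" * DIFFICULTY):
        block["nonce"] += 1
    return block

def validate(block: dict) -> dict:
    # Return the decision together with the evidence behind it,
    # so the validation outcome is explainable, not just boolean.
    start = time.perf_counter()
    digest = block_hash(block)
    return {
        "valid": digest.startswith("0" * DIFFICULTY),
        "hash": digest,
        "difficulty": DIFFICULTY,
        "validation_delay_s": time.perf_counter() - start,
    }

# Hypothetical healthcare workload executed on a local node.
workload = {"patient_record": "ecg-batch-17", "node": "local"}
report = validate(mine(workload))
print(report["valid"])  # True
```

A tampered block (e.g. changing `patient_record` after mining) would fail validation, and the returned report would show exactly which hash missed the target, which is the kind of decision transparency the scheme aims for.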

References

R. Kumar, D. Javeed, A. Aljuhani, A. Jolfaei, P. Kumar, and A. N. Islam, "Blockchain-based authentication and explainable AI for securing consumer IoT applications," IEEE Transactions on Consumer Electronics, vol. 70, no. 1, pp. 1145–1154, 2023.

K. Muneer and U. Fatima, "Cryptocurrencies analytics with machine learning and human-centered explainable AI: Enhancing decision-making in dynamic market," International Journal of Computer Applications, vol. 975, p. 8887, 2023.

S. Sachan and X. Liu, "Blockchain-based auditing of legal decisions supported by explainable AI and generative AI tools," Engineering Applications of Artificial Intelligence, vol. 129, p. 107666, 2024.

P. Kumar, D. Javeed, R. Kumar, and A. N. Islam, "Blockchain and explainable AI for enhanced decision making in cyber threat detection," Software: Practice and Experience, vol. 54, no. 8, pp. 1337–1360, 2024.

H. Y. Chen, K. Sharma, C. Sharma, and S. Sharma, "Integrating explainable artificial intelligence and blockchain to smart agriculture: Research prospects for decision making and improved security," Smart Agricultural Technology, vol. 6, p. 100350, 2023.

R. Salama and M. Ragab, "Blockchain with explainable artificial intelligence driven intrusion detection for clustered IoT driven ubiquitous computing system," Preprint, 2023.

Z. Abou El Houda, H. Moudoud, B. Brik, and L. Khoukhi, "Securing federated learning through blockchain and explainable AI for robust intrusion detection in IoT networks," in IEEE INFOCOM 2023 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS), IEEE, 2023, pp. 1–6.

N. Sharma and P. G. Shambharkar, "Multi-attention DeepCRNN: An efficient and explainable intrusion detection framework for Internet of Medical Things environments," Knowledge and Information Systems, pp. 1–67, 2025.

M. Hasan, M. S. Rahman, H. Janicke, and I. H. Sarker, "Detecting anomalies in blockchain transactions using machine learning classifiers and explainability analysis," Blockchain: Research and Applications, vol. 5, no. 3, p. 100207, 2024.

J. Dutta, H. B. Eldeeb, and T. D. Ho, "Advanced eHealth with explainable AI: Secured by blockchain with AI-empowered block sensitivity for adaptive authentication," in 2024 IEEE 35th International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), IEEE, 2024, pp. 1–7.

S. Brohi and Q.-u.-a. Mastoi, "AI under attack: Metric-driven analysis of cybersecurity threats in deep learning models for healthcare applications," Algorithms, vol. 18, no. 3, p. 157, 2025.

Q. Mastoi, S. Latif, S. Brohi, J. Ahmad, A. Alqhatani, M. Alshehri, A. Al Mazroa, and R. Ullah, "Explainable AI in medical imaging: An interpretable and collaborative federated learning model for brain tumor classification," Frontiers in Oncology, vol. 15, pp. 1535478–1535478, 2025.

A. Lakhan et al., "Metaverse-assisted healthcare body sensor network architecture," in 2024 IEEE 20th International Conference on Body Sensor Networks (BSN), IEEE, 2024, pp. 1–4.

C. M. van Leersum and C. Maathuis, "Human-centred explainable AI decision-making in healthcare," Journal of Responsible Technology, vol. 21, p. 100108, 2025.

Published

2025-04-22

How to Cite

Memon, M. F., Matlo, M. A., Siddiqui, A. A., Siddiqui, S. A., Mastoi, Q. U. A., & Lakhan, A. (2025). A Novel Blockchain Proof of Validation Scheme Based on Explainable AI for Healthcare Workload. VAWKUM Transactions on Computer Sciences, 13(1), 54–67. https://doi.org/10.21015/vtcs.v13i1.2122