An Explainable Artificial Intelligence (XAI) Based Driver Fatigue Detection System For Safe And Care Human Life (SCHL)
DOI:
https://doi.org/10.64252/b1waxa37

Keywords: Driver Fatigue Detection Systems (DFDS), Decision Trees, Safe and Care Human Life (SCHL), Deep Learning, eXplainable Artificial Intelligence (XAI)

Abstract
Road safety experts identify driver fatigue as a major cause of traffic accidents and fatalities, yet existing fatigue detection methods operate without clear explanations of their analysis process. This paper investigates the integration of Explainable Artificial Intelligence (XAI) into Driver Fatigue Detection Systems (DFDS) for Safe and Care Human Life (SCHL) purposes, aiming to raise safety levels and protect human life. XAI models enable AI systems to generate transparent explanations of how fatigue is identified and how predictions are produced. The paper examines decision trees, rule-based models, attention mechanisms, and deep learning techniques with respect to their ability to explain model outcomes. The review analyzes the technical effectiveness of these methods for fatigue detection while prioritizing the transparency standards required for field deployment. It also examines how fusing multiple sensor data streams, including biometrics, vehicle dynamics, and environmental factors, improves detection precision and makes the system more resilient. Additionally, the review discusses strategies for incorporating XAI into DFDS and future approaches to reconciling model accuracy requirements with human-readable explanations. This research shows how effective explanation mechanisms improve the reliability, trustworthiness, and effectiveness of driver fatigue detection systems for safer transportation environments.
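To illustrate the kind of transparent model the abstract refers to, the sketch below trains a shallow decision tree on synthetic multimodal features and exports its learned rules as a human-readable explanation. The feature names, thresholds, and data are assumptions for illustration only; they are not taken from the paper.

```python
# Minimal sketch, assuming hypothetical fatigue features and synthetic data:
# a shallow decision tree classifies fatigue, and its learned rules are
# exported as a human-readable explanation (the XAI aspect).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Assumed feature names: biometric and vehicle-dynamics signals.
features = ["eye_closure_pct", "blink_rate_hz", "steering_reversals", "lane_deviation_m"]

# Synthetic samples; toy rule: high eye closure plus lane deviation => fatigued.
n = 400
X = rng.normal(size=(n, 4))
y = ((X[:, 0] + X[:, 3]) > 1.0).astype(int)  # 1 = fatigued

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The tree's decision rules double as the explanation shown to an auditor.
print(export_text(clf, feature_names=features))
```

A shallow tree trades some accuracy for rules a safety auditor can read directly, which matches the paper's theme of balancing model precision with human-readable explanations.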