Explainable AI for Environmental Decision Support: Interpreting Deep Learning Models in Climate Science

Authors

  • Dr. T. Vengatesh
  • Kishor Barasu Bhangale
  • Ronicca M.S
  • Dr. Harikrushna Gantayat
  • Dr. R. Viswanathan
  • Mihirkumar B. Suthar
  • D. Vanathi
  • Dr. Shanky Goyal

DOI:

https://doi.org/10.64252/h6ng2f46

Keywords:

Explainable AI (XAI), Deep Learning, Climate Science, CNN-LSTM, SHAP, Environmental Decision Support

Abstract

Deep learning (DL) models have demonstrated high accuracy in climate science applications but suffer from "black-box" opacity, hindering their adoption in environmental decision-making. This research bridges this gap by integrating Explainable AI (XAI) techniques with DL models to enhance transparency in climate predictions. Using a hybrid Convolutional Neural Network-Long Short-Term Memory (CNN-LSTM) architecture, we forecast regional temperature anomalies and interpret outputs via SHAP (SHapley Additive exPlanations) and LIME (Local Interpretable Model-agnostic Explanations). Our methodology is validated on ERA5 reanalysis data (1980–2025), achieving a prediction RMSE of 0.86°C. XAI analysis reveals that oceanic heat fluxes and atmospheric pressure patterns are critical drivers of anomalies. The framework empowers policymakers with actionable insights, ensuring DL models are both accurate and trustworthy for climate action.
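The SHAP attributions mentioned in the abstract are grounded in Shapley values from cooperative game theory: each feature's contribution is its marginal effect on the prediction, averaged over all feature subsets. As an illustration only (the paper's CNN-LSTM and ERA5 pipeline are not reproduced here), the sketch below computes exact Shapley values for a hypothetical three-feature linear surrogate of a temperature-anomaly predictor; the feature names, coefficients, and baseline are invented for the example.

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley values for model f at input x against a baseline.

    Each feature's attribution is its marginal contribution f(S ∪ {i}) - f(S),
    weighted over all subsets S of the other features. Exact enumeration is
    tractable only for a handful of features; SHAP approximates this at scale.
    """
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Shapley kernel weight |S|! (n - |S| - 1)! / n!
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                with_i = [x[j] if (j in S or j == i) else baseline[j] for j in range(n)]
                without_i = [x[j] if j in S else baseline[j] for j in range(n)]
                phi[i] += weight * (f(with_i) - f(without_i))
    return phi

# Hypothetical linear surrogate: anomaly driven by (oceanic heat flux,
# pressure anomaly, humidity) — purely illustrative coefficients.
f = lambda v: 0.6 * v[0] + 0.3 * v[1] + 0.1 * v[2]
x = [2.0, 1.0, 0.5]          # observed inputs
baseline = [0.0, 0.0, 0.0]   # reference climatology
print(shapley_values(f, x, baseline))  # ≈ [1.2, 0.3, 0.05], i.e. coef * (x - baseline)
```

For a linear model the attributions reduce to coefficient times deviation from baseline, and they satisfy the efficiency property: the attributions sum to f(x) − f(baseline). The same principle underlies SHAP explanations of the CNN-LSTM described above, where the flux and pressure features receive the largest attributions.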


Published

2025-07-02

Section

Articles

How to Cite

Explainable AI for Environmental Decision Support: Interpreting Deep Learning Models in Climate Science. (2025). International Journal of Environmental Sciences, 1078-1088. https://doi.org/10.64252/h6ng2f46