Fig. 2 | BMC Medical Informatics and Decision Making

From: A novel higher performance nomogram based on explainable machine learning for predicting mortality risk in stroke patients within 30 days based on clinical features on the first day ICU admission

Explainable LightGBM results using SHapley Additive exPlanations (SHAP) on the testing dataset. A: SHAP feature-analysis summary plot of the top 10 variables. The X-axis shows the SHAP value and the Y-axis lists the features, ranked in descending order of importance. Each dot is one patient's SHAP value for a given feature value; red indicates higher feature values with a positive influence on death risk, whereas blue indicates the opposite effect. B: SHAP partial dependence plots (PDPs) for each selected variable. The X-axis shows the feature value and the Y-axis the corresponding SHAP value. SHAP values greater than 0 indicate that the feature at that value is a risk factor for death; the cut-off point is where the SHAP value equals zero