Table 4 Hyperparameters for the ensemble learning algorithms

From: Improved liver disease prediction from clinical data through an evaluation of ensemble learning approaches

Algorithm

Hyperparameters

Boosting

XGB

XGBClassifier(learning_rate = 0.1, n_estimators = 1000, max_depth = 5, min_child_weight = 6, reg_alpha = 60.0, subsample = 0.6, colsample_bytree = 0.8, gamma = 4.20)

GB

GradientBoostingClassifier(random_state = 45, learning_rate = [0.1, 2, 5], n_estimators = 5000, max_depth = 4, weight = 6, verbose = 1)

LGBM

LGBMClassifier(boosting_type = 'lgbm', random_state = 45, learning_rate = 0.1, n_estimators = 1000, max_depth = 2, min_child_samples = 250, silent = True, n_jobs = 6)

Bagging

BDT

BaggingClassifier(base_estimator = None, bootstrap = False, bootstrap_features = True, n_estimators = 500, n_jobs = -1, oob_score = False, random_state = 42, verbose = 0)

RF

RandomForestClassifier(n_estimators = 1000, criterion = 'gini', max_depth = None, min_samples_split = 2, min_samples_leaf = 1, max_features = 16, bootstrap = True, random_state = 42)

ET

ExtraTreesClassifier(n_estimators = 1000, criterion = 'gini', max_depth = 1000, min_samples_split = 10, min_samples_leaf = 2, max_features = 10, bootstrap = 2, random_state = 42)

Voting

LR + DT + SVM

eclf = VotingClassifier(estimators = [('lr', LogisticRegression()), ('dt', DecisionTreeClassifier()), ('svm', SVC(probability = True))], voting = 'soft'); params = {'lr__C': [1.0, 100.0], 'svm__C': [2, 3, 4]}; GridSearchCV(estimator = eclf, param_grid = params, cv = 2)
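
For concreteness, the following is a minimal scikit-learn sketch of the voting row above: a soft-voting ensemble of logistic regression, a decision tree, and an SVM, tuned with a grid search over the listed parameter candidates. The variables X and y (the clinical feature matrix and disease labels) are placeholders, not taken from the source.

from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import GridSearchCV

# Soft voting averages the predicted class probabilities of the base
# learners, so SVC needs probability=True to expose predict_proba.
eclf = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression()),
        ("dt", DecisionTreeClassifier()),
        ("svm", SVC(probability=True)),
    ],
    voting="soft",
)

# Candidate values from the table; the 'name__param' keys route each
# setting to the matching base estimator inside the ensemble.
params = {"lr__C": [1.0, 100.0], "svm__C": [2, 3, 4]}

grid = GridSearchCV(estimator=eclf, param_grid=params, cv=2)
# grid.fit(X, y)  # X, y: clinical features and labels (placeholders)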