
Table 4 The ranges of hyperparameters used for optimization

From: Prediction of 12-month recurrence of pancreatic cancer using machine learning and prognostic factors

Algorithm: Ranges of hyperparameters

ANNs: Number of neurons in hidden layer = [5–20], learning rate = [0.1–0.7], maximum epochs = [30–100], validation threshold = [50–80].

Bagging: Number of iterations = [8–20], Max-depth = [5–15], Max-features = [1–10], Min-sample-leaf = [1–10], Min-sample-split = [2–50].

J-48: Confidence factor = [0.15–0.3], Min-num-object = [1–3].

NB: Search algorithm = [K2 algorithm, genetic search, repeated hill climber].

K-NN: K = [3, 5, 7, 9].

LR: Binomial procedure = [Enter, Forward, Backward], maximum iterations = [10–25], confidence interval = [0.8–0.95].

LightGBM: Num-leaves = [30–70], Max-depth = [6–15], Min-data-in-leaf = [1–5], number of iterations = [10–20], Learning-rate = [0.1–0.2], Bagging fraction = [0.5–0.7], Reg-alpha = [0.5–10], Reg-lambda = [0.1–10].

RF: Max-depth = [6–10], number of randomly chosen variables = [5–12], number of iterations = [10–30], Min-sample-split = [1, 2, 3, 5, 10], Min-sample-leaf = [1, 2, 3, 5, 10], Max-features = [5–15], Criterion = [gini, entropy].

SVM: Tolerance parameter = [0.001–0.01], C = [10–30].

XG-Boost: Eta = [0.1–1], Max-depth = [6–15], classifier = [Rep-tree, Random-tree, J-48], number of epochs = [10–30].
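
For readers who want to set up a comparable tuning run, the sketch below translates a few rows of this table into Python search spaces with scikit-learn and LightGBM. It is a minimal illustration only, not the authors' pipeline: the Weka-style names in the table (J-48, the K2 and hill-climber searches) have no direct scikit-learn equivalents, so the parameter-name mappings, the use of RandomizedSearchCV, and the placeholder X/y data are assumptions.

# Hypothetical mapping of selected Table 4 rows to Python search spaces.
# Parameter-name correspondences are assumptions, not the authors' configuration.
from scipy.stats import randint, uniform
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from lightgbm import LGBMClassifier  # assumes the lightgbm package is installed

search_spaces = {
    # RF row: depth 6-10, 10-30 trees, 5-15 features, gini/entropy split criterion.
    "RF": (RandomForestClassifier(), {
        "max_depth": randint(6, 11),
        "n_estimators": randint(10, 31),
        "max_features": randint(5, 16),
        "min_samples_split": [2, 3, 5, 10],  # the table also lists 1, but scikit-learn requires >= 2
        "min_samples_leaf": [1, 2, 3, 5, 10],
        "criterion": ["gini", "entropy"],
    }),
    # K-NN row: K in {3, 5, 7, 9}.
    "K-NN": (KNeighborsClassifier(), {"n_neighbors": [3, 5, 7, 9]}),
    # SVM row: tolerance 0.001-0.01, C 10-30 (uniform(loc, scale) spans [loc, loc + scale]).
    "SVM": (SVC(), {"tol": uniform(0.001, 0.009), "C": uniform(10, 20)}),
    # LightGBM row: leaves 30-70, depth 6-15, learning rate 0.1-0.2, bagging fraction 0.5-0.7.
    "LightGBM": (LGBMClassifier(), {
        "num_leaves": randint(30, 71),
        "max_depth": randint(6, 16),
        "min_child_samples": randint(1, 6),  # min data in leaf
        "n_estimators": randint(10, 21),     # number of boosting iterations
        "learning_rate": uniform(0.1, 0.1),
        "subsample": uniform(0.5, 0.2),      # bagging fraction
        "reg_alpha": uniform(0.5, 9.5),
        "reg_lambda": uniform(0.1, 9.9),
    }),
}

# Usage (X and y stand for the preprocessed features and 12-month recurrence labels, not shown here):
# estimator, space = search_spaces["RF"]
# search = RandomizedSearchCV(estimator, space, n_iter=50, cv=5, scoring="roc_auc", random_state=0)
# search.fit(X, y)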