Table 2 Configuration of hyperparameters to tune in the developed models

From: An explainable analysis of diabetes mellitus using statistical and artificial intelligence techniques

| Technique | Hyperparameter | Options to configure |
| --- | --- | --- |
| PSO-FCM | Initial population | 50, 100, 150, 200, 250, 300 |
| | Activation function | Sigmoid, tanh |
| | Inference function | Standard-Kosko, Modified-Kosko, Rescaled |
| ANN (MLP) | Hidden units | 16, 32, 64, 128, 256 |
| | Learning rate | 0.0001, 0.001, 0.01, 0.05, 0.1, 0.5 |
| | Activation function | Tanh, ReLU |
| | Optimizer | Stochastic gradient descent, Adam |
| | Learning rate schedule | Constant, adaptive |
| SVM | Kernel | Linear, radial, sigmoid |
| | C | 0.0001, 0.001, 0.01, 0.1, 1.0, 10, 100, 1000 |
| | Gamma | 0.0001, 0.001, 0.01, 0.1, 1, 10, 100, 1000 |
| XGBoost | Predictors chosen at random | Random values between 1 and 20 |
| | Number of trees | 10, 100, 1000 |
| | Minimum node size | 2, 5, 7, 11, 15, 20, 25, 30 |
| | Depth of trees | 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25 |
| | Learning rate | Random values between 0 and 1 |
| | Minimum loss reduction | Random values between 0 and 0.01 |
| | Subsample percentage | Random values between 0 and 1 |
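For orientation, the grid above can be written as search spaces for a standard tuning loop. The sketch below is a minimal illustration only, assuming scikit-learn (GridSearchCV, RandomizedSearchCV, MLPClassifier, SVC) and the xgboost Python package; the parameter names, cross-validation settings, and the mapping of "predictors chosen at random" and "minimum node size" onto xgboost arguments are assumptions for this example, not the paper's implementation, and the PSO-FCM model is omitted because it has no off-the-shelf estimator in these libraries.

```python
# Sketch of the Table 2 search spaces using scikit-learn and xgboost
# parameter names (an assumption; the paper's own tuning code is not shown).
from scipy.stats import uniform
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from xgboost import XGBClassifier

# ANN (MLP): hidden units, learning rate, activation, optimizer, schedule.
mlp_grid = {
    "hidden_layer_sizes": [(16,), (32,), (64,), (128,), (256,)],
    "learning_rate_init": [0.0001, 0.001, 0.01, 0.05, 0.1, 0.5],
    "activation": ["tanh", "relu"],
    "solver": ["sgd", "adam"],                 # stochastic gradient descent, Adam
    "learning_rate": ["constant", "adaptive"], # learning rate schedule
}
mlp_search = GridSearchCV(MLPClassifier(max_iter=2000), mlp_grid, cv=5)

# SVM: kernel, C, gamma ("radial" corresponds to "rbf" in scikit-learn).
svm_grid = {
    "kernel": ["linear", "rbf", "sigmoid"],
    "C": [0.0001, 0.001, 0.01, 0.1, 1.0, 10, 100, 1000],
    "gamma": [0.0001, 0.001, 0.01, 0.1, 1, 10, 100, 1000],
}
svm_search = GridSearchCV(SVC(), svm_grid, cv=5)

# XGBoost: the table mixes fixed lists with random ranges, so a randomized
# search fits naturally.  The count of randomly chosen predictors (1-20) is
# approximated here by a per-node column-sampling fraction, and minimum node
# size by min_child_weight, its closest xgboost analogue.
xgb_space = {
    "colsample_bynode": uniform(0.05, 0.95),   # stands in for 1-20 predictors per split
    "n_estimators": [10, 100, 1000],           # number of trees
    "min_child_weight": [2, 5, 7, 11, 15, 20, 25, 30],
    "max_depth": [1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25],
    "learning_rate": uniform(0.0, 1.0),
    "gamma": uniform(0.0, 0.01),               # minimum loss reduction
    "subsample": uniform(0.01, 0.99),          # sample percentage; kept strictly above 0
}
xgb_search = RandomizedSearchCV(XGBClassifier(), xgb_space, n_iter=100, cv=5)

# Usage: fit each search object on the training split, e.g.
#   mlp_search.fit(X_train, y_train); print(mlp_search.best_params_)
```

The number of random-search iterations, the cross-validation folds, and MLPClassifier's max_iter are placeholders chosen for the example.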