Research Article

Sensor Type, Axis, and Position-Based Fusion and Feature Selection for Multimodal Human Daily Activity Recognition in Wearable Body Sensor Networks

Table 4

Parameters of each of the adopted classifiers.


Multilayer Perceptron (MLP)
sklearn.neural_network.MLPClassifier(hidden_layer_sizes=(100,), activation="relu", solver="adam", alpha=0.0001, batch_size="auto", learning_rate="constant", learning_rate_init=0.001, power_t=0.5, max_iter=200, shuffle=True, random_state=None, tol=0.001, verbose=False, warm_start=False, momentum=0.9, nesterovs_momentum=True, early_stopping=False, validation_fraction=0.1, beta_1=0.9, beta_2=0.999, epsilon=1e-08, n_iter_no_change=10, max_fun=15000)
We tested hidden_layer_sizes = (10, 20, 50, 75, 100); hidden_layer_sizes = 100 gave the best results.

Decision tree (DT)
sklearn.tree.DecisionTreeClassifier(criterion="gini", splitter="best", max_depth=None, min_samples_split=2, min_samples_leaf=1, min_weight_fraction_leaf=0.0, max_features=None, random_state=None, max_leaf_nodes=None, min_impurity_decrease=0.0, min_impurity_split=None, class_weight=None, presort="deprecated", ccp_alpha=0.0)

Random forest (RF)
sklearn.ensemble.RandomForestClassifier(n_estimators=128, criterion="gini", max_depth=None, min_samples_split=2, min_samples_leaf=1, min_weight_fraction_leaf=0.0, max_features="auto", max_leaf_nodes=None, min_impurity_decrease=0.0, min_impurity_split=None, bootstrap=True, oob_score=False, n_jobs=None, random_state=None, verbose=0, warm_start=False, class_weight=None, ccp_alpha=0.0, max_samples=None)
We tested n_estimators = (10, 64, 128, 256); n_estimators = 256 gave the best results. However, the difference from n_estimators = 128 was slight, so we used 128 to reduce the overall running time.

k-Nearest neighbors (k-NN)
sklearn.neighbors.KNeighborsClassifier(n_neighbors=5, weights="uniform", algorithm="auto", leaf_size=30, p=2, metric="minkowski", metric_params=None, n_jobs=None, **kwargs)
We tested n_neighbors = (1, 5, 10, 20); n_neighbors = 5 gave the best results.

Support vector machine (SVM)
sklearn.svm.SVC(C=10, kernel="linear", degree=3, gamma="auto", coef0=0.0, shrinking=True, probability=False, tol=0.001, cache_size=200, class_weight=None, verbose=False, max_iter=-1, decision_function_shape="ovr", break_ties=False, random_state=None)
We tested C = (1, 10, 20, 50, 100) with kernel = "rbf" and kernel = "linear"; the best results were obtained with gamma = "auto", C = 10, and kernel = "linear".

Naive Bayes (NB)
sklearn.naive_bayes.GaussianNB(priors=None, var_smoothing=1e-09)
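As a minimal sketch of how the classifiers in Table 4 could be instantiated, the snippet below sets only the non-default parameters reported above (all other arguments keep their scikit-learn defaults) and scores each model on a synthetic multiclass dataset. The synthetic data, train/test split, and `random_state` values are illustrative assumptions, not the paper's wearable-sensor features or evaluation protocol.

```python
# Sketch only: Table 4 classifiers with their reported non-default settings,
# evaluated on a synthetic stand-in for the extracted activity features.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

# Synthetic multiclass data standing in for the real sensor feature vectors.
X, y = make_classification(n_samples=400, n_features=20, n_informative=10,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

classifiers = {
    "MLP": MLPClassifier(hidden_layer_sizes=(100,), max_iter=200, random_state=0),
    "DT": DecisionTreeClassifier(criterion="gini", random_state=0),
    "RF": RandomForestClassifier(n_estimators=128, random_state=0),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(C=10, kernel="linear", gamma="auto"),
    "NB": GaussianNB(var_smoothing=1e-09),
}

# Fit each classifier and record its test-set accuracy.
scores = {}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)
    scores[name] = clf.score(X_te, y_te)

for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

The same dictionary-of-models pattern extends naturally to a grid search over the candidate values the authors report testing (e.g. n_estimators or n_neighbors).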