Research Article
Towards Optimization of Boosting Models for Formation Lithology Identification
Table 2
Tuned parameters for boosting models with search range and optimum value.
| Boosting model | Tuned parameter | Search range | Optimum value |
| --- | --- | --- | --- |
| AdaBoost | Learning rate | 0.1–0.9 | 0.4 |
| AdaBoost | Number of iterations | 50–300 | 200 |
| GTB | Learning rate | 1e−5–1 | 0.3 |
| GTB | Minimum number of samples required at a leaf node | 5–20 | 20 |
| GTB | Maximum depth of the individual tree | 5–20 | 20 |
| GTB | Number of boosting steps | 100–200 | 200 |
| GTB | Minimum number of samples required to split an internal node | 10–50 | 25 |
| GTB | Subsample | 0.6–1 | 0.7 |
| XGBoost | Learning rate | 0–0.3 | 0.3 |
| XGBoost | Minimum loss reduction to split | 0.1–0.5 | 0.2 |
| XGBoost | L1 regularization term on weights | 1e−5–1e−2 | 1e−4 |
| XGBoost | Minimum number of samples required at a leaf node | 5–50 | 20 |
| XGBoost | Maximum depth of the individual tree | 1–9 | 6 |
| XGBoost | Number of boosting steps | 300–900 | 900 |
| XGBoost | Ratio of columns sampled when constructing trees | 50–100 | 60 |
| XGBoost | Subsample | 0.4–1 | 0.7 |
| XGBoost | Minimum sum of instance weight in a child | 1–10 | 2 |
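As a minimal sketch of how the table's search spaces and tuned optima could be carried in code, the snippet below records each space as a plain dictionary and collapses it to the final configuration. The keyword names (`learning_rate`, `reg_alpha`, `min_child_weight`, etc.) follow common XGBoost/scikit-learn conventions and are an assumption, not part of the original table; the snippet is dependency-free and illustrative only.

```python
# Search ranges and tuned optima transcribed from Table 2.
# NOTE: the mapping of table rows to library keyword names (e.g.
# "Minimum sum of instance weight" -> `min_child_weight`) is an
# assumption and should be checked against the library documentation.
adaboost_space = {
    "learning_rate": {"range": (0.1, 0.9), "optimum": 0.4},
    "n_estimators":  {"range": (50, 300),  "optimum": 200},
}

xgboost_space = {
    "learning_rate":    {"range": (0.0, 0.3),    "optimum": 0.3},
    "gamma":            {"range": (0.1, 0.5),    "optimum": 0.2},   # min loss reduction to split
    "reg_alpha":        {"range": (1e-5, 1e-2),  "optimum": 1e-4},  # L1 regularization term
    "max_depth":        {"range": (1, 9),        "optimum": 6},
    "n_estimators":     {"range": (300, 900),    "optimum": 900},   # number of boosting steps
    "subsample":        {"range": (0.4, 1.0),    "optimum": 0.7},
    "min_child_weight": {"range": (1, 10),       "optimum": 2},
}

def optimum_config(space):
    """Collapse a search-space description to its tuned parameter values."""
    return {name: spec["optimum"] for name, spec in space.items()}

print(optimum_config(adaboost_space))
```

A dictionary returned by `optimum_config` can then be unpacked directly into a model constructor, e.g. `AdaBoostClassifier(**optimum_config(adaboost_space))`, once the keyword names are confirmed against the target library.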