Research Article
An Application of a Three-Stage XGBoost-Based Model to Sales Forecasting of a Cross-Border E-Commerce Enterprise
Table 1
The description of parameters in the XGBoost model.
| Type of parameters | Parameters | Description of parameters | Main purpose |
|---|---|---|---|
| Booster parameters | max_depth | Maximum depth of a tree | Increasing this value makes the model more complex and more likely to overfit |
| | min_child_weight | Minimum sum of instance weights in a child | The larger the min_child_weight is, the more conservative the algorithm will be |
| | max_delta_step | Maximum delta step allowed for each leaf output | It can help make the update step more conservative |
| | gamma | Minimum loss reduction required to make a split | The larger the gamma is, the more conservative the algorithm will be |
| | subsample | Subsample ratio of the training instances | It is used in the update to prevent overfitting |
| | colsample_bytree | Subsample ratio of columns for each tree | It is used in the update to prevent overfitting |
| | eta | Learning rate | Step-size shrinkage used in the update to prevent overfitting |
| Regularization parameters | alpha | L1 regularization term on weights | Increasing this value makes the model more conservative |
| | lambda | L2 regularization term on weights | Increasing this value makes the model more conservative |
| Learning task parameters | objective (reg:linear) | Learning objective | It is used to specify the learning task and the learning objective |
| Command line parameters | num_round (number of estimators) | Number of boosting iterations | It is used to specify the number of iterative calculations |
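The parameters in Table 1 map directly onto the parameter dictionary passed to the XGBoost training API. The sketch below is illustrative only: the numeric values are common defaults or placeholders, not the values tuned in this study, and `reg:squarederror` is the name current XGBoost releases use for the `reg:linear` objective listed in the table.

```python
# Illustrative XGBoost parameter dictionary mirroring Table 1.
# All values here are placeholders, not the tuned values from this study.
params = {
    # Booster parameters
    "max_depth": 6,            # maximum depth of a tree; larger -> more complex, easier to overfit
    "min_child_weight": 1,     # minimum sum of instance weights in a child; larger -> more conservative
    "max_delta_step": 0,       # caps each leaf's weight update; > 0 makes updates more conservative
    "gamma": 0,                # minimum loss reduction required to split; larger -> more conservative
    "subsample": 0.8,          # row subsample ratio per boosting round, to prevent overfitting
    "colsample_bytree": 0.8,   # column subsample ratio per tree, to prevent overfitting
    "eta": 0.1,                # learning rate (step-size shrinkage)
    # Regularization parameters
    "alpha": 0,                # L1 regularization term on leaf weights
    "lambda": 1,               # L2 regularization term on leaf weights
    # Learning task parameter
    "objective": "reg:squarederror",  # "reg:linear" in older XGBoost releases
}
num_boost_round = 100  # number of boosting iterations (the "number of estimators" in Table 1)

# These would then be passed to the training call, e.g.:
#   booster = xgboost.train(params, dtrain, num_boost_round=num_boost_round)
```

In practice the booster and regularization parameters above are the ones swept during hyperparameter tuning, while the objective and iteration count are usually fixed by the task.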