Research Article

An Application of a Three-Stage XGBoost-Based Model to Sales Forecasting of a Cross-Border E-Commerce Enterprise

Table 1

Description of the parameters in the XGBoost model.

| Type of parameters | Parameters | Description of parameters | Main purpose |
| --- | --- | --- | --- |
| Booster parameters | max_depth | Maximum depth of a tree | Increasing this value makes the model more complex and more likely to overfit |
| | min_child_weight | Minimum sum of instance weights in a child | The larger min_child_weight is, the more conservative the algorithm will be |
| | max_delta_step | Maximum delta step allowed for each leaf output | It helps make the update step more conservative |
| | gamma | Minimum loss reduction required to make a split | The larger gamma is, the more conservative the algorithm will be |
| | subsample | Subsample ratio of the training instances | It is used in the update to prevent overfitting |
| | colsample_bytree | Subsample ratio of columns for each tree | It is used in the update to prevent overfitting |
| | eta | Learning rate | Step-size shrinkage used in the update to prevent overfitting |
| Regularization parameters | alpha | L1 regularization term on weights | Increasing this value makes the model more conservative |
| | lambda | L2 regularization term on weights | Increasing this value makes the model more conservative |
| Learning task parameters | reg:linear | Learning objective | It specifies the learning task and the corresponding learning objective |
| Command line parameters | Number of estimators | Number of boosting iterations | It specifies the number of iterative calculations |
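For illustration only, the sketch below shows how the parameters listed in Table 1 map onto the scikit-learn interface of the xgboost Python package. The numeric values are placeholders and are not the tuned settings reported in this study; the objective is written with its current name, reg:squarederror, which corresponds to the reg:linear label used in the table.

```python
# Minimal sketch: setting the Table 1 parameters on an XGBoost regressor.
# Values are illustrative defaults, not the tuned values from the study.
import xgboost as xgb

model = xgb.XGBRegressor(
    max_depth=6,              # booster: maximum depth of a tree
    min_child_weight=1,       # booster: minimum sum of instance weights in a child
    max_delta_step=0,         # booster: maximum delta step for each leaf output
    gamma=0,                  # booster: minimum loss reduction to make a split
    subsample=0.8,            # booster: subsample ratio of training instances
    colsample_bytree=0.8,     # booster: subsample ratio of columns per tree
    learning_rate=0.1,        # booster: eta, step-size shrinkage
    reg_alpha=0,              # regularization: L1 term on leaf weights
    reg_lambda=1,             # regularization: L2 term on leaf weights
    objective="reg:squarederror",  # learning task objective (older releases call it "reg:linear")
    n_estimators=100,         # number of boosting iterations
)
# model.fit(X_train, y_train) would then train the sales-forecasting model.
```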