Research Article
[Retracted] Application of Artificial Intelligence Combined with 5G Technology in the Reform of English Teaching in Universities
Algorithm 1: Random forest classification algorithm
Input:
    S: the training data set
    B: the feature (characteristic) set
    Z: the target (class) attribute
    l: the number of trees
Output: random forest R
Method:
    For i = 1 to l do
        Draw a bootstrap sample from S, giving the in-of-bag set IOBi and the out-of-bag set OOBi;
        ti(IOBi) = generateTree(IOBi);
        Compute the out-of-bag score OOBRi of ti(IOBi) on OOBi using formula (12);
    End for
    Sort all l trees by their OOBR;
    Select the top 85 percent of trees with the strongest OOBR scores and combine them into the improved random forest R;

generateTree():
    Create a new node G;
    If the stopping criteria are met, then
        Return G as a leaf node;
    Else
        For j = 1 to N do
            Compute the correlation cor(Bj, Z) by equation (10);
        End for
        Estimate the feature weights by formula (11);
        Apply the feature-weighting technique;
        Use the candidate features (c) to construct the optimal split for the node;
        For each partition, call generateTree();
    End if
    Return G;
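The selection step above can be sketched in Python. Since formulas (10)–(12) are not reproduced here, this sketch makes two assumptions: plain out-of-bag accuracy stands in for the OOBR score of formula (12), and the feature-weighted splitting of equations (10)–(11) is left to the default splitter of `DecisionTreeClassifier`. The function names (`train_selected_forest`, `forest_predict`) are illustrative, not from the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

def train_selected_forest(X, y, n_trees=40, keep_ratio=0.85, seed=0):
    """Grow n_trees bootstrap trees, score each on its out-of-bag
    sample (proxy for OOBR), and keep the strongest keep_ratio fraction."""
    rng = np.random.default_rng(seed)
    n = len(X)
    scored = []
    for _ in range(n_trees):
        iob = rng.integers(0, n, size=n)        # in-of-bag bootstrap indices
        oob = np.setdiff1d(np.arange(n), iob)   # out-of-bag indices
        tree = DecisionTreeClassifier(random_state=0).fit(X[iob], y[iob])
        oob_acc = tree.score(X[oob], y[oob]) if len(oob) else 0.0
        scored.append((oob_acc, tree))
    scored.sort(key=lambda p: p[0], reverse=True)   # rank trees by OOB score
    kept = max(1, int(keep_ratio * n_trees))        # retain the top 85%
    return [t for _, t in scored[:kept]]

def forest_predict(forest, X):
    """Majority vote over the retained trees."""
    votes = np.stack([t.predict(X) for t in forest])
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
```

With `n_trees=40` and `keep_ratio=0.85`, the pruned forest keeps the 34 trees with the best out-of-bag scores; prediction is an ordinary majority vote, as in a standard random forest.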