Research Article
Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection
Function: mutation with partial derivative
  Pass in: wolf_pos, derivatives, probability
  1. Set the probability vector to the scaled derivative vector multiplied by probability.
  2. Set selected_features by comparing the probability vector against a generated random-number vector.
  3. Set new_pos to the XOR of wolf_pos and selected_features.
  Pass out: new_pos
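The mutation routine above can be sketched in Python with NumPy. This is a minimal illustration, not the authors' implementation: it assumes wolf_pos is a binary feature-inclusion vector, and it interprets the "scaled derivative vector" as derivative magnitudes normalized to [0, 1]; the function and helper names are illustrative.

```python
import numpy as np

def mutate_with_derivative(wolf_pos, derivatives, probability, rng=None):
    """Toggle feature bits of a wolf's binary position, guided by derivatives.

    Features with larger partial-derivative magnitudes receive a higher
    mutation probability; selected bits are flipped via XOR.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Scale derivative magnitudes to [0, 1] (assumed meaning of "scaled
    # derivative vector"), then weight by the mutation probability.
    mags = np.abs(derivatives)
    scaled = mags / mags.max() if mags.max() > 0 else mags
    prob_vector = scaled * probability
    # A feature is selected when its probability exceeds a uniform random draw.
    selected_features = prob_vector > rng.random(prob_vector.shape)
    # XOR flips the selected bits of the binary position vector.
    new_pos = np.logical_xor(wolf_pos.astype(bool), selected_features).astype(int)
    return new_pos
```

Because a zero derivative yields a zero mutation probability, features the gradient is indifferent to are never flipped by this sketch.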