Research Article
Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection
1. Function mutation with partial derivative
2. Pass in: wolf_pos, derivatives, probability threshold, a
3. Set weight = 0.4a + 0.1
4. Normalize the partial derivative
5. Calculate the sigmoid of the normalized partial derivative and assign it to sig
6. Use the probability threshold and weight to choose the feature indices to change
7. Calculate the new wolf position and return it
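A minimal Python sketch of this mutation routine is given below. The listing above does not specify the normalization scheme, the exact selection rule in step 6, or the form of the wolf position, so the min-max normalization, the rule comparing weight * sig against the probability threshold, and the bit-flip update on a binary feature-selection vector are all assumptions made for illustration.

    import numpy as np

    def sigmoid(x):
        """Standard logistic function."""
        return 1.0 / (1.0 + np.exp(-x))

    def mutate_with_partial_derivative(wolf_pos, derivatives, prob_threshold, a):
        """Gradient-guided mutation of one wolf (steps 2-7 above).

        wolf_pos       : binary feature-selection vector (one 0/1 entry per feature)
        derivatives    : partial derivative of the objective w.r.t. each feature
        prob_threshold : probability threshold for choosing features to mutate
        a              : GWO control parameter
        """
        # Step 3: weight derived from the control parameter a
        weight = 0.4 * a + 0.1

        # Step 4: normalize the partial derivatives
        # (min-max normalization is an assumption; the listing only says "normalize")
        d_min, d_max = derivatives.min(), derivatives.max()
        norm = (derivatives - d_min) / (d_max - d_min + 1e-12)

        # Step 5: sigmoid of the normalized derivatives
        sig = sigmoid(norm)

        # Step 6: choose the feature indices to change
        # (assumed rule: mutate where the weighted sigmoid score exceeds the threshold)
        mask = weight * sig > prob_threshold

        # Step 7: flip the selected bits to obtain the new wolf position
        new_pos = wolf_pos.copy()
        new_pos[mask] = 1 - new_pos[mask]
        return new_pos

Under this reading, the sigmoid maps the gradient information into (0, 1) so it can be combined with the weight and compared against the probability threshold, and features with stronger weighted gradient scores are the ones toggled in or out of the selected subset.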