Research Article

Subspace Learning via Local Probability Distribution for Hyperspectral Image Classification

Algorithm 2

Proposed PD-LFDA method.
Input: HSI data samples $X = [x_1, x_2, \ldots, x_n] \in \mathbb{R}^{d \times n}$ with class labels $y_i \in \{1, 2, \ldots, C\}$, the objective dimension to be embedded $r$ ($r < d$),
  the nearest neighbor parameter $k$ (default: ), and the test sample $x_{\mathrm{test}} \in \mathbb{R}^{d}$
Output: Transformation matrix $T \in \mathbb{R}^{d \times r}$
  Steps are as follows:
    (1) % Initialize matrices
    (2) $S^{(w)} \leftarrow \mathbf{0}_{d \times d}$; % within-class scatter
    (3) $S^{(b)} \leftarrow \mathbf{0}_{d \times d}$; % between-class scatter
    (4) % Compute within-class affinity matrix
    (5) for $c = 1, 2, \ldots, C$ do % in a classwise manner
    (6)   $X_c \leftarrow \{x_i \mid y_i = c\}$; % the class data samples
    (7)   $X_c = [x_1^{c}, x_2^{c}, \ldots, x_{n_c}^{c}] \in \mathbb{R}^{d \times n_c}$; % sample matrix, where $n_c$ is the number of samples in class $c$
    (8)   % Determine the local scaling
    (9)   for $i = 1, 2, \ldots, n_c$ do
   (10)     $x_i^{c(k)} \leftarrow$ the $k$-th nearest neighbor of $x_i^{c}$ in $X_c$;
   (11)     $\sigma_i \leftarrow \lVert x_i^{c} - x_i^{c(k)} \rVert$; % local scaling around $x_i^{c}$
   (12)   end for
   (13)   % Define local affinity matrix
   (14)   for $i, j = 1, 2, \ldots, n_c$ do
   (15)     $p_c \leftarrow n_c / n$; % prior probability
   (16)     $A_{i,j}^{c} \leftarrow \exp\!\left(-\lVert x_i^{c} - x_j^{c} \rVert^{2} / (\sigma_i \sigma_j)\right)$;
   (17)   end for
   (18) end for
   (19) $A \leftarrow \operatorname{blockdiag}(A^{1}, A^{2}, \ldots, A^{C})$; % in block diagonal manner
   (20) $P \leftarrow \operatorname{blockdiag}(p_1 \mathbf{1}_{n_1 \times n_1}, \ldots, p_C \mathbf{1}_{n_C \times n_C})$; % also in block diagonal manner
   (21) for $c = 1, 2, \ldots, C$ do
   (22)   $W^{(w)}_{i,j} \leftarrow A_{i,j} / (n\,p_c)$ for all $x_i, x_j$ in class $c$, and $W^{(w)}_{i,j} = 0$ otherwise;
   (23) end for
   (24) % Compute between-class affinity matrix
   (25) % Let $F$ denote the nonzero flag of elements in $W^{(w)}$: $F_{i,j} = 1$ if $W^{(w)}_{i,j} \neq 0$; $F_{i,j} = 0$ if $W^{(w)}_{i,j} = 0$
   (26) $W^{(b)} \leftarrow \frac{1}{n}\left(\mathbf{1}_{n}\mathbf{1}_{n}^{\top} - F\right)$; % pairs from different classes receive weight $1/n$
   (27) for $c = 1, 2, \ldots, C$ do
   (28)   $W^{(b)}_{i,j} \leftarrow A_{i,j}\left(\frac{1}{n} - \frac{1}{n\,p_c}\right)$ for all $x_i, x_j$ in class $c$;
   (29) end for
   (30) % Now construct the Laplacian matrices for the within affinity matrix and the between affinity matrix
   (31) Let
   (32) $D^{(w)}_{i,i} = \sum_{j=1}^{n} W^{(w)}_{i,j}$, $D^{(b)}_{i,i} = \sum_{j=1}^{n} W^{(b)}_{i,j}$;
   (33) Then:
   (34) $L^{(w)} = D^{(w)} - W^{(w)}$, $L^{(b)} = D^{(b)} - W^{(b)}$;
   (35) Construct the two matrices below:
   (36) $S^{(w)} = X L^{(w)} X^{\top}$, $S^{(b)} = X L^{(b)} X^{\top}$;
   (37) Let $\varphi$ be the generalized eigenvector of:
   (38) $S^{(b)} \varphi = \lambda S^{(w)} \varphi$,
   (39) with the corresponding eigenvalues in descending order:
   (40) $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_r$;
   (41) Finally, the transformation matrix can be represented as:
   (42) $T = [\varphi_1, \varphi_2, \ldots, \varphi_r] \in \mathbb{R}^{d \times r}$;
   (43) For a new test sample $x$, the embedding is given by:
   (44) $z = T^{\top} x \in \mathbb{R}^{r}$.
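
As a concrete illustration of the steps above, the following Python/NumPy sketch assembles the same pipeline: local scaling $\sigma_i$ taken from the $k$-th nearest neighbor, Gaussian affinities, class priors $p_c = n_c/n$, graph Laplacians $L = D - W$, scatter matrices $S = X L X^{\top}$, and the generalized eigenproblem $S^{(b)}\varphi = \lambda S^{(w)}\varphi$. It is a minimal sketch rather than the authors' reference implementation: the function name pd_lfda, the default k = 7, the small ridge added to $S^{(w)}$, and the use of the standard local Fisher discriminant analysis affinity weights are assumptions made for illustration where the paper's exact probability-distribution weighting is not reproduced here.

import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def pd_lfda(X, y, r, k=7):
    # X: (d, n) matrix whose columns are training spectra; y: (n,) class labels.
    # Returns a (d, r) transformation matrix T (columns = generalized eigenvectors).
    # k = 7 is an assumed default; the paper's own default is not reproduced here.
    y = np.asarray(y)
    d, n = X.shape
    W_w = np.zeros((n, n))                      # within-class affinity W^(w)
    W_b = np.zeros((n, n))                      # between-class affinity W^(b)
    W_b[y[:, None] != y[None, :]] = 1.0 / n     # different-class pairs: weight 1/n

    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Xc = X[:, idx].T                        # (n_c, d) samples of class c
        n_c = len(idx)
        p_c = n_c / n                           # prior probability of class c

        # Local scaling: sigma_i = distance from x_i to its k-th nearest neighbor.
        D = cdist(Xc, Xc)
        kk = min(k, n_c - 1)
        sigma = np.sort(D, axis=1)[:, kk] if kk > 0 else np.ones(n_c)
        sigma = np.maximum(sigma, 1e-12)

        # Local affinity with pairwise scaling sigma_i * sigma_j.
        A = np.exp(-D ** 2 / np.outer(sigma, sigma))

        # Class blocks of the affinities (standard LFDA weighting, written
        # through the prior: 1 / n_c == 1 / (n * p_c)).
        W_w[np.ix_(idx, idx)] = A / (n * p_c)
        W_b[np.ix_(idx, idx)] = A * (1.0 / n - 1.0 / (n * p_c))

    # Graph Laplacians and the induced scatter matrices.
    L_w = np.diag(W_w.sum(axis=1)) - W_w
    L_b = np.diag(W_b.sum(axis=1)) - W_b
    S_w = X @ L_w @ X.T
    S_b = X @ L_b @ X.T

    # Generalized eigenproblem S_b v = lambda S_w v; a small ridge keeps S_w
    # positive definite so that the symmetric solver applies.
    lam, V = eigh(S_b, S_w + 1e-6 * np.eye(d))
    order = np.argsort(lam)[::-1][:r]
    return V[:, order]

A test pixel $x$ (a length-$d$ spectrum) is then embedded as z = T.T @ x, and the resulting $r$-dimensional features can be passed to a conventional classifier.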