Research Article
A Fortran-Keras Deep Learning Bridge for Scientific Computing
Algorithm 1
Original code from [46]. Layer operations occur inside the network module, limiting flexibility.
pure subroutine fwdprop(self, x)
  ! Performs the forward propagation and stores arguments to activation
  ! functions and activations themselves for use in backprop.
  class(network_type), intent(in out) :: self
  real(rk), intent(in) :: x(:)
  integer(ik) :: n
  associate(layers => self % layers)
    layers(1) % a = x
    do n = 2, size(layers)
      layers(n) % z = matmul(transpose(layers(n-1) % w), layers(n-1) % a) + layers(n) % b
      layers(n) % a = self % layers(n) % activation(layers(n) % z)
    end do
  end associate
end subroutine fwdprop
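The loop above computes, for each layer n, z = transpose(W) a + b followed by the layer's activation. A minimal NumPy sketch of the same recurrence is shown below; the function name `fwdprop`, the list-of-arrays weight layout, and the ReLU activation are illustrative assumptions, not part of the original Fortran module.

```python
import numpy as np

def fwdprop(weights, biases, activations, x):
    # Sketch of the forward pass in Algorithm 1 (not the FKB implementation).
    # weights[i] maps layer i to layer i+1 and is stored so that the
    # transpose(w) in the Fortran loop corresponds to w.T here.
    a = x
    for w, b, act in zip(weights, biases, activations):
        z = w.T @ a + b   # z(n) = transpose(w(n-1)) a(n-1) + b(n)
        a = act(z)        # a(n) = activation(z(n))
    return a

relu = lambda z: np.maximum(z, 0.0)   # assumed activation for illustration
rng = np.random.default_rng(0)
weights = [rng.normal(size=(3, 4)), rng.normal(size=(4, 2))]
biases = [np.zeros(4), np.zeros(2)]
out = fwdprop(weights, biases, [relu, relu], np.ones(3))
```

Because each layer's `z` and `a` are computed in sequence, storing them per layer (as the Fortran `layer` derived type does) is what later allows backpropagation to reuse these intermediate values.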