Research Article

A Fading Channel Simulator Implementation Based on GPU Computing Techniques

Table 1

Channel emulator implementation comparison: time consumed to generate one megasample of channel realizations (in milliseconds).

       Matlab¹     CUDA Libs²

Min    1640.895    37.376
Max    1821.171    75.186
Mean   1760.821    46.935

¹CPU: Intel Core i5, 3.4 GHz, 16 GB RAM.
²GPU: GeForce GTX 780M, 4 GB.
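The table reports wall-clock time to produce one megasample of channel realizations. As an illustrative sketch only (not the authors' Matlab or CUDA implementation), the following times a CPU-based generation of one megasample of Rayleigh-fading realizations, modeled in the standard way as zero-mean circularly symmetric complex Gaussian samples with unit average power; the function name and timing harness are hypothetical:

```python
import time
import numpy as np

def rayleigh_fading_samples(n, rng):
    # Rayleigh-fading channel realizations: zero-mean circularly
    # symmetric complex Gaussian samples, normalized so E[|h|^2] = 1.
    return (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

rng = np.random.default_rng(0)
n = 1_000_000  # one megasample, matching the unit used in Table 1

start = time.perf_counter()
h = rayleigh_fading_samples(n, rng)
elapsed_ms = (time.perf_counter() - start) * 1e3
print(f"{n} samples generated in {elapsed_ms:.3f} ms")
```

A GPU version of the same generator (e.g. via CUDA's cuRAND library, as the "CUDA Libs" column suggests) would replace the NumPy calls with device-side random number generation, which is where the roughly 35-45x speedup in the table comes from.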