Research Article
Efficient Deep Learning Models for DGA Domain Detection
Table 6
Overall classification performances of deep learning models.
| DGA | CNN P | CNN R | CNN F1 | LSTM_Attention P | LSTM_Attention R | LSTM_Attention F1 | BiLSTM_Attention P | BiLSTM_Attention R | BiLSTM_Attention F1 | Ensemble P | Ensemble R | Ensemble F1 | Support |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Non-DGA (alexa) | 0.9664 | 0.9978 | 0.9818 | 0.9827 | 0.9908 | 0.9867 | 0.9840 | 0.9910 | 0.9875 | 0.9869 | 0.9968 | 0.9918 | 60,424 |
| Banjori | 0.9995 | 1.0000 | 0.9997 | 0.9995 | 1.0000 | 0.9997 | 0.9995 | 1.0000 | 0.9998 | 0.9998 | 1.0000 | 0.9999 | 44,025 |
| Tinba | 0.8954 | 0.9862 | 0.9386 | 0.9294 | 0.9835 | 0.9557 | 0.9287 | 0.9890 | 0.9579 | 0.9273 | 0.9960 | 0.9604 | 6,720 |
| Post | 0.9998 | 0.9977 | 0.9988 | 1.0000 | 0.9998 | 0.9999 | 1.0000 | 0.9997 | 0.9998 | 1.0000 | 0.9997 | 0.9998 | 6,492 |
| Ramnit | 0.7637 | 0.8408 | 0.8004 | 0.8287 | 0.8964 | 0.8612 | 0.8321 | 0.8951 | 0.8625 | 0.8486 | 0.9023 | 0.8747 | 6,369 |
| Qakbot | 0.7164 | 0.6620 | 0.6882 | 0.7491 | 0.7995 | 0.7735 | 0.7676 | 0.7798 | 0.7737 | 0.7820 | 0.7970 | 0.7894 | 4,015 |
| Necurs | 0.7421 | 0.6927 | 0.7166 | 0.9310 | 0.7691 | 0.8424 | 0.9472 | 0.7953 | 0.8646 | 0.9483 | 0.8183 | 0.8785 | 3,248 |
| Murofet | 0.8158 | 0.7685 | 0.7914 | 0.8342 | 0.8286 | 0.8314 | 0.8230 | 0.8325 | 0.8277 | 0.8464 | 0.8209 | 0.8335 | 2,859 |
| Shiotob/urlzone/bebloh | 0.9817 | 0.8472 | 0.9095 | 0.9723 | 0.9136 | 0.9420 | 0.9858 | 0.9068 | 0.9447 | 0.9918 | 0.9136 | 0.9511 | 1,459 |
| Simda | 0.9399 | 0.9710 | 0.9552 | 0.9603 | 0.9791 | 0.9696 | 0.9586 | 0.9858 | 0.9720 | 0.9728 | 0.9912 | 0.9819 | 1,481 |
| Ranbyus | 0.5685 | 0.3792 | 0.4549 | 0.8054 | 0.8346 | 0.8197 | 0.8718 | 0.8323 | 0.8516 | 0.8536 | 0.8542 | 0.8539 | 1,324 |
| Pykspa | 0.9399 | 0.8992 | 0.9191 | 0.9336 | 0.9778 | 0.9552 | 0.9308 | 0.9899 | 0.9595 | 0.9790 | 0.9889 | 0.9840 | 992 |
| Dyre | 0.9910 | 1.0000 | 0.9955 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 769 |
| Kraken | 0.9843 | 0.7884 | 0.8755 | 0.9410 | 0.8438 | 0.8898 | 0.9329 | 0.8577 | 0.8937 | 0.9443 | 0.8539 | 0.8968 | 794 |
| Cryptolocker | 0.7212 | 0.1181 | 0.2030 | 0.5411 | 0.3732 | 0.4418 | 0.5152 | 0.4535 | 0.4824 | 0.6347 | 0.3228 | 0.4280 | 635 |
| Nymaim | 0.4000 | 0.0407 | 0.0740 | 0.6000 | 0.1885 | 0.2868 | 0.5377 | 0.2784 | 0.3669 | 0.6818 | 0.2547 | 0.3708 | 589 |
| Locky | 1.0000 | 0.0207 | 0.0405 | 0.7933 | 0.2736 | 0.4068 | 0.8696 | 0.3218 | 0.4698 | 0.9774 | 0.2989 | 0.4577 | 435 |
| Vawtrak | 0.0000 | 0.0000 | 0.0000 | 0.6916 | 0.2334 | 0.3491 | 0.7049 | 0.2713 | 0.3918 | 0.9484 | 0.7539 | 0.8401 | 317 |
| Shifu | 0.4945 | 0.7904 | 0.6084 | 0.8435 | 0.9651 | 0.9002 | 0.8550 | 0.9782 | 0.9124 | 0.8840 | 0.9651 | 0.9228 | 229 |
| Ramdo | 0.9846 | 0.9948 | 0.9897 | 0.9847 | 1.0000 | 0.9923 | 0.9847 | 1.0000 | 0.9923 | 0.9897 | 1.0000 | 0.9948 | 193 |
| P2P | 0.4118 | 0.0707 | 0.1207 | 0.4149 | 0.3939 | 0.4041 | 0.3957 | 0.3737 | 0.3844 | 0.4615 | 0.7273 | 0.5647 | 198 |
| Macro-average | 0.7770 | 0.6603 | 0.6696 | 0.8446 | 0.7735 | 0.7909 | 0.8488 | 0.7872 | 0.8045 | 0.8885 | 0.8217 | 0.8369 | 143,567 |
| Weighted-average | 0.9389 | 0.9454 | 0.9384 | 0.9603 | 0.9616 | 0.9597 | 0.9622 | 0.9634 | 0.9618 | 0.9676 | 0.9682 | 0.9666 | 143,567 |
P: Precision, R: Recall, F1: F1-score.