Research Article

KoRASA: Pipeline Optimization for Open-Source Korean Natural Language Understanding Framework Based on Deep Learning

Table 2

Parameter comparison between DIET-Base and DIET-Opt (KoRASA).

Parameter                       DIET-Base     DIET-Opt (KoRASA)
Epochs                          300           500
Number of transformer layers    2             4
Transformer size                256           256
Masked language model           True          True
Drop rate                       0.25          0.25
Weight sparsity                 0.8           0.7
Embedding dimension             20            30
Hidden layer sizes              (256, 128)    (512, 128)
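The parameters in the DIET-Opt column correspond to hyperparameters that RASA exposes on its DIETClassifier component. A minimal sketch of how these settings could be expressed in a RASA `config.yml` pipeline is shown below; the parameter names follow the public RASA documentation, and the mapping from the table rows to those names is an assumption, not the authors' published configuration file.

```yaml
# Hypothetical RASA config.yml fragment reflecting the DIET-Opt column above.
# Parameter names follow RASA's DIETClassifier documentation; this is a sketch,
# not the KoRASA authors' exact configuration.
pipeline:
  - name: DIETClassifier
    epochs: 500
    number_of_transformer_layers: 4
    transformer_size: 256
    use_masked_language_model: true
    drop_rate: 0.25
    weight_sparsity: 0.7        # newer RASA releases rename this to connection_density
    embedding_dimension: 30
    hidden_layers_sizes:
      text: [512, 128]          # assumed to correspond to "Hidden layer sizes" in the table
```

Note that 0.8 is RASA's default weight sparsity, matching the DIET-Base column, which suggests the base configuration largely follows DIET's defaults while DIET-Opt tunes a subset of them.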