Research Article

[Retracted] Software Systems Security Vulnerabilities Management by Exploring the Capabilities of Language Models Using NLP

Algorithm 7

BERT with DCNN model.
Input: labeled security- and non-security-related text
Process:
(1)Data preprocessing and tokenization to create a BERT layer:
FullTokenizer = bert.bert_tokenization.FullTokenizer
bert_layer = hub.KerasLayer(
"https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/1",
trainable = False)
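Step (1) converts raw text into the WordPiece ids that BERT expects. The sketch below is a minimal illustration only: the real FullTokenizer loads its WordPiece vocabulary from the hub layer's assets and performs subword splitting, whereas the toy vocabulary and `encode` helper here are hypothetical stand-ins that show the `[CLS] … [SEP]` id layout.

```python
# Toy stand-in for BERT's FullTokenizer. The real tokenizer does WordPiece
# subword splitting; a whitespace split with a hypothetical vocab is enough
# to show the id framing BERT expects. Unknown words map to 0 here as a
# simplification (real BERT uses a dedicated [UNK] id, 100).
toy_vocab = {"[PAD]": 0, "[CLS]": 101, "[SEP]": 102,
             "security": 2005, "bug": 6843, "found": 2179}

def encode(text):
    # BERT inputs are framed as: [CLS] token ids ... [SEP]
    ids = [toy_vocab["[CLS]"]]
    ids += [toy_vocab.get(tok, 0) for tok in text.lower().split()]
    ids.append(toy_vocab["[SEP]"])
    return ids
```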
(2)For data set creation, pad the data batches so that all training sequences have a consistent length
(3)Create the training and test data set batches
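Steps (2) and (3) can be sketched with `tf.keras.preprocessing.sequence.pad_sequences` and `tf.data`; the toy token-id sequences below are placeholders for the real tokenized text, and the batch size of 2 is illustrative only.

```python
import tensorflow as tf

# Placeholder token-id sequences and labels; in the paper's pipeline these
# would come from the BERT FullTokenizer applied to the labeled corpus.
sequences = [[101, 7592, 102], [101, 2054, 2003, 2023, 102], [101, 102]]
labels = [1, 0, 1]

# Pad every sequence with trailing zeros to the length of the longest one,
# so all training sequences have a consistent length.
padded = tf.keras.preprocessing.sequence.pad_sequences(
    sequences, padding="post")

# Shuffle and batch into a tf.data pipeline for training.
dataset = (tf.data.Dataset.from_tensor_slices((padded, labels))
           .shuffle(3)
           .batch(2))
```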
(4)Build the DCNN model per the specifications provided in the tables above
(5)Train the model with the specifications provided in Table 2
(6)DCNN model compilation: define the class DCNN(tf.keras.Model) and instantiate it as
dcnn = DCNN(vocab_size, emb_dim, nb_filters, FFN_units, nb_classes, dropout_rate)
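Steps (4) and (6) can be sketched as a Keras subclassed model with the constructor signature shown above. The internal layer choices (parallel Conv1D branches with kernel sizes 2, 3, and 4, global max pooling, then a dense feed-forward block) are assumptions, since the paper's layer tables are not reproduced here.

```python
import tensorflow as tf

class DCNN(tf.keras.Model):
    """Sketch of a deep CNN text classifier; the branch structure is an
    assumed configuration, not taken from the paper's tables."""

    def __init__(self, vocab_size, emb_dim=128, nb_filters=50,
                 FFN_units=512, nb_classes=2, dropout_rate=0.1):
        super().__init__()
        self.embedding = tf.keras.layers.Embedding(vocab_size, emb_dim)
        # Parallel convolutions over bigrams, trigrams, and 4-grams.
        self.convs = [tf.keras.layers.Conv1D(nb_filters, k, activation="relu")
                      for k in (2, 3, 4)]
        self.pool = tf.keras.layers.GlobalMaxPool1D()
        self.dense = tf.keras.layers.Dense(FFN_units, activation="relu")
        self.dropout = tf.keras.layers.Dropout(dropout_rate)
        # Binary classification uses a single sigmoid unit.
        out_units = 1 if nb_classes == 2 else nb_classes
        out_act = "sigmoid" if nb_classes == 2 else "softmax"
        self.out = tf.keras.layers.Dense(out_units, activation=out_act)

    def call(self, inputs, training=False):
        x = self.embedding(inputs)
        # Concatenate the max-pooled feature maps from each branch.
        x = tf.concat([self.pool(conv(x)) for conv in self.convs], axis=-1)
        x = self.dropout(self.dense(x), training=training)
        return self.out(x)

# 30522 is the bert_en_uncased vocabulary size.
model = DCNN(vocab_size=30522)
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])
```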
(7)Fit the model with training data
(8)Model evaluation with test data
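Steps (5), (7), and (8) reduce to the standard Keras fit/evaluate loop. For self-containment the sketch below uses a small stand-in classifier and random dummy data; the actual run would train the DCNN above on the padded BERT-tokenized batches.

```python
import numpy as np
import tensorflow as tf

# Stand-in classifier so the example is self-contained; the paper's DCNN
# would be used here instead.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(100, 8),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])

# Dummy padded token-id sequences and binary labels.
x_train = np.random.randint(1, 100, size=(32, 10))
y_train = np.random.randint(0, 2, size=(32,))

# Step (7): fit the model with training data.
model.fit(x_train, y_train, epochs=1, batch_size=8, verbose=0)

# Step (8): evaluate on held-out data (dummy slice here).
loss, acc = model.evaluate(x_train[:8], y_train[:8], verbose=0)
```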
Output:
Accuracy: 97.44%