
Research Article | Open Access


Meng-Yi Han, Tian-Ao Xie, Jia-Xin Li, Hui-Jin Chen, Xiao-Hui Yang, Xu-Guang Guo, "Evaluation of Lateral-Flow Assay for Rapid Detection of Influenza Virus", BioMed Research International, vol. 2020, Article ID 3969868, 16 pages, 2020.

Evaluation of Lateral-Flow Assay for Rapid Detection of Influenza Virus

Academic Editor: Haruki Komatsu
Received: 13 May 2020
Accepted: 11 Aug 2020
Published: 08 Sep 2020


Background. Influenza virus mainly causes acute respiratory infections in humans. However, diagnosing influenza on clinical evidence alone is inaccurate, as flu symptoms are similar to those of other respiratory viruses. The lateral-flow assay (LFA) is a rapid method for detecting influenza virus, but its effectiveness for this purpose is unclear. Hence, a meta-analysis was performed to evaluate the accuracy of LFA in detecting influenza virus. Methods. Relevant literature was retrieved from the PubMed, Embase, Web of Science, and Cochrane Library databases with the keywords “lateral flow assay” and “flu virus”. Using Meta-DiSc software, the pooled sensitivity, pooled specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), diagnostic odds ratio (DOR), summary receiver operating characteristic (SROC) curve, and area under the curve (AUC) were calculated. Results. This meta-analysis covers 24 data sets from 13 studies. The pooled sensitivity and specificity of LFA for detecting influenza virus were 0.84 (95% CI: 0.82-0.86) and 0.97 (95% CI: 0.97-0.98), respectively. The pooled PLR, NLR, and DOR were 32.68 (17.16-62.24), 0.17 (0.13-0.24), and 334.07 (144.27-773.53), and the AUC of the SROC curve was 0.9877. No publication bias was found. Conclusions. LFA exhibited high sensitivity and specificity in diagnosing influenza virus. It is a valuable alternative method that can diagnose influenza virus quickly. However, more evidence is required to confirm whether LFA is comparable to traditional methods for detecting the virus.

1. Introduction

Influenza epidemics are a worldwide public health challenge that imposes a substantial socioeconomic burden [1]. The World Health Organization (WHO) reports that every year, about 1 billion people worldwide catch the flu, of whom three to five million are severe cases and 290,000 to 650,000 die from respiratory diseases caused by the flu [2].

Seasonal influenza is caused by influenza viruses, and the rapid spread of this acute respiratory infectious disease poses a threat to people worldwide. Influenza virus belongs to the Orthomyxoviridae family, which comprises four types: A, B, C, and D [3]. The spread of influenza A and B viruses causes seasonal epidemics [4]. Influenza C viruses are similar to influenza B viruses and are known to cause relatively mild respiratory disease in humans [5]. Influenza D viruses, which have the potential for zoonotic and interspecies transmission, were the last members of the Orthomyxoviridae family to be discovered, and research on their mechanisms is still in its infancy [6]. Therefore, this article mainly discusses influenza A and B viruses.

Clinical features of influenza patients are similar to those of patients infected with other respiratory viruses such as rhinovirus, respiratory syncytial virus, parainfluenza virus, and adenovirus. This makes the diagnosis of influenza on clinical grounds alone potentially inaccurate [7]. Consequently, laboratory diagnostic tests are essential for the diagnosis of influenza.

Currently, real-time reverse transcription-polymerase chain reaction (RT-PCR) and virus culture are regarded as the gold standards for the laboratory diagnosis of influenza viruses [8]. However, virus culture takes up to 10 days to yield results, reducing its utility for clinical management [9]. RT-PCR shows higher sensitivity than virus culture, and results are obtained within 4-6 hours of submitting the specimen; however, the high cost of the specialized equipment and expertise required means that RT-PCR is rarely used [10, 11]. Meanwhile, the lateral-flow assay (LFA) is a rapid diagnostic test that can detect and quantify analytes in biological fluids, with results available within 5–30 min [12]. It is a simple, sensitive, and practical technique that can be used in the absence of laboratory infrastructure and without advanced biological protection equipment [13]. The basic principle is as follows: the clinical sample binds a labeled antibody, and the resulting antigen-antibody complexes migrate along a solid substrate by the capillary action of lateral flow until they are captured at a reaction zone, producing a visible signal; excess labeled antibody continues to migrate and is captured by a second antibody, producing a second colored band. By measuring and comparing the bands, personnel can make qualitative, semiquantitative, or quantitative determinations of the target antigen, and intuitive results can be obtained within a short time [14].

LFA has long been widely used in clinical practice on account of its low development cost and ease of manufacture [15, 16]. According to the recognition element used, LFA can be divided into two categories. In the lateral-flow immunoassay (LFIA), antibodies provide the recognition function; the other category, nucleic acid LFA (NALFA), is applied to test PCR products [17].

However, many users are satisfied with the simple technical operation of these tests and lack the information needed to reasonably evaluate the clinical value and reliability of the results and the scientific soundness of the diagnostic methods. Hence, this meta-analysis systematically reviews all relevant studies to assess the accuracy of LFA in detecting influenza virus.

2. Materials and Methods

2.1. Study Design

This study covered research published from January 1, 2000, to November 1, 2019. The accuracy of LFA in the identification of influenza virus was systematically evaluated.

2.2. Search Strategy

Four investigators systematically searched the literature in PubMed, Embase, Web of Science, and the Cochrane Library from January 1, 2000, to November 1, 2019. Articles in those databases were filtered with the keywords “LFA” OR “lateral-flow assays” OR “Lateral flow assay” OR “Lateral flow immunoassay” OR “Lateral flow immunochromatographic assay” AND “Influenza viruses[all synonyms]”. The retrieved articles were imported into EndNote X9.3.3.

2.3. Adoption Criteria and Screening Guidelines

The adoption criteria were as follows: (1) samples of influenza virus were identified by a research method with LFA as the core technology or by a gold standard method; (2) sufficient data were available to form a contingency table and to calculate sensitivity, specificity, diagnostic accuracy, and 95% CIs, and the literature was in English; (3) the lateral-flow assay was a core method for detecting influenza viruses; (4) the specimens involved in the literature were from humans; and (5) the specimen capacity was no fewer than forty.

The screening (exclusion) guidelines were as follows: (1) duplicate articles; (2) literature types other than articles; (3) the samples studied were from species other than humans; and (4) the gold standard for testing the virus was not mentioned.

2.4. Data Extraction

According to the adoption criteria and screening guidelines established beforehand, the literature was retrieved by four researchers independently. After screening, two evaluators extracted data from the 13 finally included studies. The following data were extracted: author, year of publication, country in which the experiments were conducted, study design, and so on. Any discrepancy in the extracted data was settled through negotiation or by a third researcher. A P value <0.05 at the 95% confidence interval was considered statistically significant.

2.5. Quality Assessment

QUADAS-2 can be used to review diagnostic accuracy and served as the evaluation criterion for the quality assessment of the research. This evaluation tool covers four domains: patient selection, index test, reference standard, and flow and timing [18].

2.6. Data Analysis and Synthesis

The diagnostic odds ratio (DOR), negative likelihood ratio (NLR), positive likelihood ratio (PLR), sensitivity, specificity, and the corresponding 95% confidence intervals (CIs) were calculated with Meta-DiSc from the contingency-table data. A random-effects model was used to describe the precision of LFA in diagnosing influenza viruses, and the results were drawn as forest plots. Stata software was used to draw a funnel plot, chiefly to analyze publication bias.
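As an illustration of the per-study quantities that Meta-DiSc pools, the indices above can be computed from a 2×2 contingency table with standard formulas (a minimal Python sketch; the cell counts below are hypothetical, not from any included study):

```python
import math

def diagnostic_metrics(tp, fp, fn, tn, z=1.96):
    """Sensitivity, specificity, PLR, NLR, and DOR with ~95% CIs from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    plr = sens / (1 - spec)
    nlr = (1 - sens) / spec
    dor = (tp * tn) / (fp * fn)

    def wald_ci(p, n):  # normal-approximation CI for a proportion
        se = math.sqrt(p * (1 - p) / n)
        return (max(0.0, p - z * se), min(1.0, p + z * se))

    def log_ci(r, se_log):  # CI for a ratio, computed on the log scale
        return (r * math.exp(-z * se_log), r * math.exp(z * se_log))

    # standard log-scale standard errors for the three ratios
    se_plr = math.sqrt(1 / tp - 1 / (tp + fn) + 1 / fp - 1 / (fp + tn))
    se_nlr = math.sqrt(1 / fn - 1 / (tp + fn) + 1 / tn - 1 / (fp + tn))
    se_dor = math.sqrt(1 / tp + 1 / fp + 1 / fn + 1 / tn)

    return {
        "sensitivity": (sens, wald_ci(sens, tp + fn)),
        "specificity": (spec, wald_ci(spec, tn + fp)),
        "PLR": (plr, log_ci(plr, se_plr)),
        "NLR": (nlr, log_ci(nlr, se_nlr)),
        "DOR": (dor, log_ci(dor, se_dor)),
    }

# Hypothetical study: 90 TP, 5 FP, 10 FN, 95 TN
m = diagnostic_metrics(90, 5, 10, 95)
```

Meta-DiSc applies the same definitions study by study before pooling, so a sketch like this is mainly useful for checking extracted contingency tables.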

2.7. Subgroup Meta-Analyses

We performed a subgroup analysis of two possible sources of heterogeneity based on the characteristics of the included studies. From the relevant literature, we speculated that differences in sample source and in the gold standard would have a great impact on detection. The literature was divided into four groups according to the viral sample source: nasal swab, nasopharyngeal aspirate, nasopharyngeal swab, and oropharyngeal swab. The literature was divided into three groups according to the gold standard used: virus culture, RT-PCR, and both virus culture and RT-PCR. Data analysis was performed with Meta-DiSc.

3. Results

3.1. Search Results

We obtained 204 articles by searching the databases mentioned above, from which 82 duplicates were eliminated. Of the remaining 122 articles, 88 were excluded by screening titles and abstracts against the inclusion/exclusion criteria; of the rest, two articles reported animal experiments, nine were basic research, two were not written in English, two had nothing to do with influenza virus, three lacked a gold standard such as culture or RT-PCR, and three had an insufficient sample size. Finally, we included 13 articles in the full-text review for meta-analysis [19–31]. An additional file shows these details (Figure S1).

3.2. Characteristics of the Included Studies

From these 13 articles, we extracted 24 sets of data to complete the tables. During data extraction, the researchers also recorded the characteristics of each article, which are summarized in Table 1.

Author | Year | Country | Study design | Reference standard method | Test technology | Rapid influenza test | Age stage | Sample type | Sample source | Type of influenza virus | Number of samples

Poehling et al. | 2002 | The United States | Prospective | Culture+PCR | LFIA | QuickVue | Children (0 to 19 years old) | Nasal swab | Hospital | Influenza A (H3N2) | 2331455209
Quach et al. | 2002 | Canada | Prospective | Culture | LFIA | QuickVue | Children | Nasopharyngeal aspirates | Hospital | Influenza A and B virus | 300424311204
Cazacu et al. | 2003 | The United States | Prospective | Culture | LFIA | QuickVue | Children (3 months to 28 years) | Nasal wash specimens | Hospital | Influenza A and B virus | 35638716295
Stripeli et al. | 2010 | Greece | Prospective | RT-PCR | LFIA | QuickVue | Children (6 months to 14 years old) | Nasal swab | Pediatrician | Influenza A virus (H3N2) | 21727713170
Patel et al. | 2011 | Germany | Prospective | RT-PCR | NALFA | The rapidSTRIPE assay | Unclear | Nasal swab | Patient centers | Influenza A (H1N1) | 1749241365
Kim et al. | 2012 | Korea | Prospective | RT-PCR | LFIA | RDT kit | Unclear | Nasal swab | Hospital | Influenza A (H1N1) | 21575417119
Sun et al. | 2013 | China | Retrospective | RT-PCR | LFIA | IC-SED | Unclear | Nasopharyngeal swab | Hospital | Influenza A (H1N1) | 6051009
Ge et al. | 2013 | China | Prospective | Culture+RT-PCR | NALFA | RT-LAMP-LFD | Unclear | 65 pharyngeal swabs, 7 sputa, and 8 tracheal aspirates | Unclear | Influenza A (H7N9) | 80220058
Leonardi et al. | 2013 | The United States | Prospective | RT-PCR | LFIA | QuickVue | Children 12 and under vs. adults | Nasopharyngeal swab | Unclear | Influenza A virus | 1416622449
Leonardi et al. | 2013 | The United States | Prospective | RT-PCR | LFIA | QuickVue | Children 12 and under vs. adults | Nasopharyngeal swab | Unclear | Influenza B virus | 781611150
Leonardi et al. | 2013 | The United States | Prospective | RT-PCR | LFIA | Sofia | Children 12 and under vs. adults | Nasopharyngeal swab | Unclear | Influenza A virus | 1417201851
Leonardi et al. | 2013 | The United States | Prospective | RT-PCR | LFIA | Sofia | Children 12 and under vs. adults | Nasopharyngeal swab | Unclear | Influenza B virus | 78220551
Zazueta-Garcia et al. | 2014 | Mexico | Retrospective | RT-PCR | LFIA | The Xpect Flu A&B Kit | Children | Nasopharyngeal washes | Hospital | Influenza A (H1N1) | 11334102445
Sakurai et al. | 2015 | Japan | Prospective | RT-PCR | LFIA | LFIC-AB | 0 to 77 years old | Nasal swab | Unclear | Influenza A | 129741054
Sakurai et al. | 2015 | Japan | Prospective | RT-PCR | LFIA | LFIC-AB | 0 to 77 years old | Nasal swab | Unclear | Influenza B | 145900154
Sakurai et al. | 2015 | Japan | Prospective | RT-PCR | LFIA | LFIC-AB | 2 to 77 years old | Self-blow nasal discharge specimens | Unclear | Influenza A | 125704051
Sakurai et al. | 2015 | Japan | Prospective | RT-PCR | LFIA | LFIC-AB | 2 to 77 years old | Self-blow nasal discharge specimens | Unclear | Influenza B | 121646051
Sakurai et al. | 2015 | Japan | Prospective | RT-PCR | LFIA | LFIC-AB | 0 to 45 years old | Nasopharyngeal aspirates | Unclear | Influenza A | 142731266
Sakurai et al. | 2015 | Japan | Prospective | RT-PCR | LFIA | LFIC-AB | 0 to 45 years old | Nasopharyngeal aspirates | Unclear | Influenza B | 124562066
Ma et al. | 2018 | China | Prospective | RT-PCR | NALFA | LFD-RPA | Unclear | Swabs and serum | Center for Disease Control and Prevention (CDC) | Influenza A (H7N9) | 50210029
Zhang et al. | 2019 | China | Prospective | RT-PCR | LFIA | HRP-LFIA | Unclear | Nasopharyngeal swab | Centers for Disease Control and Prevention (CDC) | Influenza A | 6285526565
Zhang et al. | 2019 | China | Prospective | RT-PCR | LFIA | HRP-LFIA | Unclear | Nasopharyngeal swab | Centers for Disease Control and Prevention (CDC) | Influenza B | 39071015292
Zhang et al. | 2019 | China | Prospective | RT-PCR | LFIA | HRP-LFIA | Unclear | Oropharyngeal swab | Centers for Disease Control and Prevention (CDC) | Influenza A | 64745023579
Zhang et al. | 2019 | China | Prospective | RT-PCR | LFIA | HRP-LFIA | Unclear | Oropharyngeal swab | Centers for Disease Control and Prevention (CDC) | Influenza B | 70545232626

3.3. Meta-Analyzed Publications’ QUADAS-2 Results

In order to better evaluate the quality of the articles included in the analysis, the four researchers used a unified assessment scale, the Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2), as the standard. Table 2 shows the results of the quality assessment of the 13 included studies.


Poehling et al. | 2002 | Y Y Y UC UC Y UC Y N Y N
Quach et al. | 2002 | Y Y Y Y UC Y Y Y Y Y N
Cazacu et al. | 2003 | Y Y Y UC UC Y UC Y Y Y Y
Stripeli et al. | 2010 | Y Y Y Y UC Y UC Y Y Y N
Patel et al. | 2011 | Y UC Y UC Y Y UC Y Y Y Y
Sun et al. | 2013 | Y N Y N UC Y Y Y Y Y Y
Leonardi et al. | 2013 | Y N Y N UC Y Y Y Y Y Y
Zazueta-Garcia et al. | 2014 | Y Y Y N UC Y Y Y Y Y Y
Sakurai et al. | 2015 | Y Y Y UC UC Y UC Y Y Y Y
Zhang et al. | 2019 | Y Y Y N UC Y Y Y Y Y Y

3.4. Publication Bias

Deeks’ funnel plot asymmetry test was performed to evaluate publication bias in the included studies [32]. As shown in the funnel plot (Figure 1), most of the points are symmetrically distributed. Moreover, the P value of Deeks’ test was 0.822 (>0.05), indicating that there was no publication bias.
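The logic behind Deeks’ test can be sketched as a regression of ln(DOR) on the inverse square root of the effective sample size; a slope near zero corresponds to a symmetric funnel. The sketch below is a simplified, unweighted illustration with made-up cell counts, not the full weighted test as implemented in Stata:

```python
import math

def deeks_slope(studies):
    """Simplified sketch of Deeks' asymmetry test: regress ln(DOR) on
    1/sqrt(effective sample size).  A slope near zero (P > 0.05 in the
    full weighted version) suggests a symmetric funnel, i.e. no evidence
    of publication bias.  `studies` is a list of (tp, fp, fn, tn) tuples
    with nonzero cells and at least two distinct sample sizes."""
    xs, ys = [], []
    for tp, fp, fn, tn in studies:
        dor = (tp * tn) / (fp * fn)
        n_dis, n_nondis = tp + fn, fp + tn          # diseased / non-diseased
        ess = 4 * n_dis * n_nondis / (n_dis + n_nondis)  # effective sample size
        xs.append(1 / math.sqrt(ess))
        ys.append(math.log(dor))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

# Three hypothetical studies of different sizes but identical DOR -> slope 0
slope = deeks_slope([(10, 5, 5, 10), (40, 20, 20, 40), (100, 50, 50, 100)])
```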

3.5. The Analysis of Threshold Effect

The Spearman correlation coefficient was 0.148 (<0.6), with a P value of 0.489 (>0.05). We also analyzed the SROC curve (Figure 2), which showed no “shoulder-arm” distribution. It was concluded that there was no threshold effect in the included studies.
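The threshold-effect check is a Spearman correlation between the logits of sensitivity and of 1 − specificity across studies; Meta-DiSc performs this automatically, but the computation itself is small enough to show with the standard library alone (the per-study values below are hypothetical):

```python
import math

def _ranks(xs):
    """Ranks (1-based) with ties assigned their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx) *
                    sum((b - my) ** 2 for b in ry))
    return num / den

logit = lambda p: math.log(p / (1 - p))
sens = [0.84, 0.88, 0.80, 0.91, 0.86]   # hypothetical per-study values
spec = [0.97, 0.95, 0.96, 0.97, 0.98]
rho = spearman([logit(s) for s in sens], [logit(1 - sp) for sp in spec])
# a rho below 0.6 argues against a threshold effect
```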

3.6. SROC Curve

To assess the accuracy of LFA in diagnosing influenza viruses, we constructed a SROC curve. As shown in Figure 2, the AUC was 0.9877, close to 1. Therefore, we can infer that LFA has high accuracy in the diagnosis of influenza virus.

3.7. Merge Analysis Results

Pooled values were obtained by analyzing the 13 finally included articles. The results (shown in Figures 3, 4, 5, 6, and 7) are as follows: sensitivity was 0.84 (95% CI (0.82, 0.86)), specificity was 0.97 (95% CI (0.97, 0.98)), the positive likelihood ratio was 32.68 (95% CI (17.16, 62.24)), the negative likelihood ratio was 0.17 (95% CI (0.13, 0.24)), and the diagnostic odds ratio was 334.07 (95% CI (144.27, 773.53)).
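Random-effects pooling of a quantity such as ln(DOR) typically follows the DerSimonian-Laird scheme; the sketch below illustrates the estimator with hypothetical inputs and is not Meta-DiSc's exact routine:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effects (e.g. ln DOR) under a random-effects model.
    Returns (pooled effect, its standard error, between-study variance tau^2)."""
    w = [1 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)                # DL estimator
    wr = [1 / (v + tau2) for v in variances]                     # RE weights
    pooled = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return pooled, math.sqrt(1 / sum(wr)), tau2

# Hypothetical ln(DOR) values and within-study variances for three studies
ln_dor, var = [5.2, 6.1, 5.6], [0.30, 0.25, 0.40]
pooled, se, tau2 = dersimonian_laird(ln_dor, var)
dor_ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))  # DOR scale
```

Note that the pooled value and its CI are computed on the log scale and exponentiated back, which is why the reported DOR intervals are asymmetric.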

3.8. Influenza Typing Analysis Results

Analyzing influenza A and influenza B viruses separately (Figure 8), the sensitivity and specificity for influenza A virus were 0.85 (95% CI (0.82, 0.87)) and 0.98 (95% CI (0.97, 0.99)), respectively (Figures 8(a) and 8(b)). For influenza B virus, they were 0.85 (95% CI (0.81, 0.88)) and 0.99 (95% CI (0.98, 1.00)), respectively (Figures 8(c) and 8(d)).

3.9. LFA Typing Analysis Results

According to the substance detected, LFA can be divided into LFIA and NALFA, and the values were analyzed by category. The results were as follows (Figure 9): the sensitivity and specificity of LFIA were 0.83 (95% CI (0.81, 0.85)) and 0.97 (95% CI (0.97, 0.98)), respectively (Figures 9(a) and 9(b)); those of NALFA were 0.91 (95% CI (0.85, 0.95)) and 0.97 (95% CI (0.94, 0.99)), respectively (Figures 9(c) and 9(d)).

3.10. Heterogeneity Analysis

A forest plot was drawn using a random-effects model. As shown in Figure 7, the diagnostic odds ratios of the individual studies do not lie along a single line with the combined ratio. As a rough quantitative guide, heterogeneity measured by the inconsistency index (I²) was interpreted as follows: 0–40%: low heterogeneity; 30–60%: moderate heterogeneity; 50–90%: significant heterogeneity; and 75–100%: considerable heterogeneity [33]. In our study, the I² statistics indicated considerable heterogeneity from nonthreshold effects, and high heterogeneity across studies was also detected for sensitivity, specificity, PLR, and NLR.
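The inconsistency index used above follows directly from Cochran's Q; a small Python illustration with hypothetical effects:

```python
def i_squared(effects, variances):
    """I^2 = max(0, (Q - df) / Q) * 100, with Q from fixed-effect pooling."""
    w = [1 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    return 0.0 if q == 0 else max(0.0, (q - df) / q) * 100

# Two hypothetical studies far apart -> high I^2
print(i_squared([0.0, 4.0], [1.0, 1.0]))  # 87.5
```

An I² of 87.5% for these toy values would fall in the “considerable heterogeneity” band of the guide above.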

3.11. Subgroup Meta-Analyses

The subgroup meta-analyses are summarized in Table 3.

Subgroup analysis | Number of studies | Sensitivity (95% CI) | I² | Specificity (95% CI) | I²

Group A
 Nasal swab | 6 | 0.88 (0.85-0.91) | 90.5% | 0.97 (0.95-0.98) | 15.8%
 Nasopharyngeal aspirates | 3 | 0.93 (0.88-0.96) | 90.6% | 0.88 (0.84-0.91) | 91.7%
 Nasopharyngeal swab | 7 | 0.82 (0.78-0.85) | 83.0% | 1.00 (0.99-1.00) | 42.1%
 Oropharyngeal swab | 2 | 0.62 (0.54-0.70) | 0.0% | 1.00 (0.99-1.00) | 61.8%

Group B
 Culture | 2 | 0.75 (0.65-0.83) | 10.9% | 0.91 (0.88-0.93) | 97.5%
 PCR | 20 | 0.85 (0.83-0.86) | 92.2% | 0.99 (0.98-0.99) | 83.6%
 Culture+PCR | 2 | 0.88 (0.74-0.96) | 88.2% | 0.98 (0.96-0.99) | 58.7%

CI: confidence interval; I²: inconsistency index.

For group A, test samples from different sources were used as the subgroup analysis criteria, and the results were as follows:

Nasal swab: sensitivity was 0.88 (95% CI 0.85-0.91; I² = 90.5%), and specificity was 0.97 (95% CI 0.95-0.98; I² = 15.8%).

Nasopharyngeal aspirates: sensitivity was 0.93 (95% CI 0.88-0.96; I² = 90.6%), and specificity was 0.88 (95% CI 0.84-0.91; I² = 91.7%).

Nasopharyngeal swab: sensitivity was 0.82 (95% CI 0.78-0.85; I² = 83.0%), and specificity was 1.00 (95% CI 0.99-1.00; I² = 42.1%).

Oropharyngeal swab: sensitivity was 0.62 (95% CI 0.54-0.70; I² = 0.0%), and specificity was 1.00 (95% CI 0.99-1.00; I² = 61.8%).

For group B, different gold standard methods were used as the criteria for subgroup analysis, and the results were as follows:

Viral culture: sensitivity was 0.75 (95% CI 0.65-0.83; I² = 10.9%), and specificity was 0.91 (95% CI 0.88-0.93; I² = 97.5%).

RT-PCR: sensitivity was 0.85 (95% CI 0.83-0.86; I² = 92.2%), and specificity was 0.99 (95% CI 0.98-0.99; I² = 83.6%).

Viral culture and RT-PCR: sensitivity was 0.88 (95% CI 0.74-0.96; I² = 88.2%), and the pooled specificity was 0.98 (95% CI 0.96-0.99; I² = 58.7%).

4. Discussion

This study focused on evaluating the value of LFA in the diagnosis of influenza virus. After applying the screening criteria, we included a total of 24 data sets for analysis. The quality evaluation showed that the sensitivity and specificity of LFA in the identification of influenza virus were 0.84 and 0.97, respectively. The PLR, NLR, and DOR were 32.68, 0.17, and 334.07, respectively. The SROC AUC was 0.9877 (close to 1), indicating the high sensitivity and specificity of LFA in the identification of influenza viruses.

Subsequently, we used Stata software to draw the Deeks funnel plot. When P > 0.05, it can be concluded that no publication bias was found in the study [34]. The P value of Deeks’ test was 0.822 (>0.05), so we took this to mean that no publication bias existed in our study. By drawing the SROC curve for each diagnostic approach, heterogeneity caused by the threshold effect was probed by assessing whether the points on the curve form a curved (“shoulder-arm”) pattern; a typical “shoulder-arm” pattern indicates a threshold effect [35]. However, the SROC curve for our study showed no “shoulder-arm” distribution. Moreover, when the Spearman correlation coefficient is less than 0.6, the threshold effect is considered absent. In this study, the Spearman correlation coefficient was 0.148 (<0.6), with a P value of 0.489 (>0.05), indicating that the included studies had no threshold effect.

Furthermore, subgroup analyses were conducted to investigate heterogeneity in sensitivity and specificity. The subgroup analysis of sample type indicated differences in identification capability by sampling location. The overall heterogeneity of nasopharyngeal aspirates was higher than that of the other three types: the I² values for nasopharyngeal aspirates were 90.6% for sensitivity and 91.7% for specificity, suggesting high heterogeneity. Comparing the subgroups that used culture or RT-PCR alone as the gold standard with the subgroup that used both, the specificity and sensitivity of the combined RT-PCR and culture group were higher than those of either single gold standard group. This reduction in sensitivity and specificity suggests that using only culture or only RT-PCR as the gold standard may lead to false-positive and false-negative results; accordingly, culture should not be regarded as a standalone gold standard. The I² for sensitivity in the culture subgroup of group B decreased markedly (10.9%), suggesting that the reference standard may not be a source of heterogeneity.

In addition to the above two sources of heterogeneity, we considered some other possible sources. Different laboratories process influenza virus samples differently, for example, in the environment during specimen transportation and the concentration of influenza virus in the collected samples, which can affect the experimental results. Good sample handling can therefore minimize the impact of environmental factors on virus activity. Generally speaking, influenza virus should be stored in a virus preservation solution at low temperature after collection until use, and repeated freezing and thawing should be avoided [36]. The thermal stability of the virus decreases as temperature increases; repeated freeze-thaw cycles and high temperatures reduce the stability of influenza virus RNA and accelerate its degradation, thereby affecting test results [37]. Thus, specimens should be submitted for testing as soon as possible after collection: within 30 minutes at room temperature and within 2 to 4 hours at 4°C. Specimens that cannot be processed promptly should not be stored at 4°C for more than 48 hours, and if delivery is delayed beyond 24 hours, specimens should be stored below -70°C [38]. The sensitivity of LFA also differs slightly with the age of the tested patients; some articles speculate that influenza viruses are easier to isolate and detect in older patients [21]. Regarding the technology itself, LFA mainly relies on immune recognition, nucleic acid hybridization, and antibody labeling, and the label is one of the key factors affecting sensitivity [12].
The literature included in this meta-analysis shows that different laboratories use many types of labels, such as biotin, luciferin, colloidal gold, superparamagnetic nanoparticles, and horseradish peroxidase, which affect the positive rate of LFA test results.

There is no doubt that RT-PCR and cell culture detect influenza virus with higher accuracy [11], with RT-PCR slightly more accurate than culture [39]. In our research, the sensitivity and specificity of LFA were 0.85 and 0.99 when compared against RT-PCR, but 0.75 and 0.91 when compared against cell culture. We found that using RT-PCR as the gold standard yields higher apparent accuracy for LFA, which may be why RT-PCR has become the more common gold standard for influenza virus detection in recent years.

Across the four sample types (with RT-PCR or culture as the reference), the sensitivity of LFA for detecting influenza virus ranked nasopharyngeal aspirate (93%) > nasal swab (88%) > nasopharyngeal swab (82%) > oropharyngeal swab (62%), while specificity ranked nasopharyngeal swab = oropharyngeal swab (100%) > nasal swab (97%) > nasopharyngeal aspirate (88%). Nasopharyngeal aspirates have a relatively high positive detection rate and are more suitable than throat swabs for detecting respiratory viruses [36]. Therefore, nasopharyngeal aspirates may be more suitable for LFA detection.

Analyzing the subgroup results of the 13 included articles, we found that nasopharyngeal aspirates had the highest sensitivity among the four sample categories mentioned above. This may be because nasopharyngeal aspirates carry a higher viral load than pharyngeal swabs in respiratory virus specimens, making them easier to detect, and other researchers have shown experimentally that nasopharyngeal aspirates are more sensitive than pharyngeal swabs [36, 40]. Both nasopharyngeal aspirates and pharyngeal swabs are upper respiratory tract specimens; compared with these, airway aspirates, alveolar lavage fluid, and other lower respiratory tract specimens offer better sensitivity but cannot be widely used because of difficulties in collection [41]. At the same time, more literature shows that although nasopharyngeal aspirates are more sensitive than pharyngeal swabs, the detection sensitivity of nasopharyngeal swabs is not much lower than that of nasopharyngeal aspirates, and pharyngeal swabs are more popular because their collection is convenient and fast [42]. Therefore, nasopharyngeal aspirates are superior in terms of sensitivity alone, but pharyngeal swabs are more practical overall.

Virus type also affected the accuracy of LFA. Four of the included articles report that LFA detects influenza A virus more sensitively than influenza B virus, and their specimens were generally nasopharyngeal swabs and nasopharyngeal aspirates [21, 25, 27, 31]. However, in our study, the sensitivity of LFA for influenza A and B viruses was not significantly different. Further analysis found that in one of the included studies, LFA was more sensitive in detecting influenza B virus than influenza A virus in nasal swabs [29]. Therefore, we infer that collecting nasal swab samples may enhance the sensitivity of detecting influenza B virus.

Furthermore, we analyzed the results of the two different types of LFA tests. We found that the sensitivity of NALFA for detecting influenza virus is higher than that of LFIA, with no obvious difference in specificity. The core of NALFA is nucleic acid hybridization, which captures and detects nucleic acid amplification products in a format similar to lateral-flow immunoassays [43]. Combining NALFA with amplification-based sample preparation technology, such as loop-mediated isothermal amplification (LAMP), recombinase polymerase amplification (RPA), or rapid amplification/hybridization reactions, might compensate for the limitations of qualitative or semiquantitative LFA and improve its accuracy in rapid detection. In addition, the sensitivity of NALFA depends to a certain extent on the virus concentration in respiratory samples, and a higher virus concentration can produce a rapid positive result [23]. The virus concentration in respiratory samples is related not only to the type of virus and the organs or systems involved but also to host factors such as the patient’s age and immune status [44], and the amount of virus shed varies with the course of the patient’s disease and the sampling site [45]. Therefore, variability in sample sources will have some impact on the sensitivity of NALFA and LFIA test results.

In the included literature, LFIA comprises classic LFIA methods and improved LFIA methods. Both mostly rely on antigen-antibody reactions; the main difference lies in the labels used, which affect the sensitivity of the results. However, we have not found literature comparing the performance of the classic and improved LFIA methods, so we cannot determine whether the improved methods are more sensitive.

Our study has the following limitations. First, LFA cannot distinguish between influenza virus subtypes. In addition, it is not clear whether patient age affects the accuracy of LFA in diagnosing influenza virus; because we did not contact the authors, the ages of the sampled patients in many of the included studies are unclear, so children and adults could not be clearly separated. Although the overall sensitivity of LFA detection is high, the results are not robust, and exactly how to improve the stability and sensitivity of the detection results in various situations remains to be studied.

In summary, LFA is a fast, affordable, and accurate method for detecting influenza viruses and is therefore promising; it may ultimately contribute more to the diagnosis of influenza viruses than the current gold standard methods.

5. Conclusion

In conclusion, our study demonstrates that LFA has high sensitivity and specificity in the diagnosis of influenza virus. More efforts should be made to define the accuracy of this promising test for diagnosing influenza virus in the future.

Data Availability

All data generated or analyzed during this study are included in this published article and its supplementary information files.

Conflicts of Interest

The authors declare that they have no conflicts of interest regarding the work submitted.

Authors’ Contributions

XGG initiated the study. TAX did a preliminary assessment. MYH did the literature searches and screening. JXL and HJC did data extraction and quality assessments. All data extraction was verified by MYH and XHY. All authors participated in the writing and revision of the manuscript and approved the final version of the manuscript. Meng-Yi Han and Tian-Ao Xie contributed equally to this work.

Supplementary Materials

Additional file 1. Figure S1: flow chart of the literature review. (Supplementary Materials)


  1. V. J. Lee, Z. J. M. Ho, E. H. Goh et al., “Advances in measuring influenza burden of disease,” Influenza and Other Respiratory Viruses, vol. 12, no. 1, pp. 3–9, 2018.
  2. World Health Organization, WHO launches new global influenza strategy, March 2020,
  3. S. Su, X. Fu, G. Li, F. Kerlin, and M. Veit, “Novel influenza D virus: epidemiology, pathology, evolution and biological characteristics,” Virulence, vol. 8, no. 8, pp. 1580–1591, 2017.
  4. J. E. Park and Y. Ryu, “Transmissibility and severity of influenza virus by subtype,” Infection, Genetics and Evolution, vol. 65, pp. 288–292, 2018.
  5. B. Crescenzo-Chaigne, C. Barbezange, and S. van der Werf, “Non coding extremities of the seven influenza virus type C vRNA segments: effect on transcription and replication by the type C and type A polymerase complexes,” Virology Journal, vol. 5, no. 1, p. 132, 2008.
  6. K. Asha and B. Kumar, “Emerging influenza D virus threat: what we know so far!,” Journal of Clinical Medicine, vol. 8, no. 2, p. 192, 2019.
  7. World Health Organization, Influenza (seasonal), March 2020,
  8. M. Gonzalez-Del Vecchio, P. Catalan, V. de Egea et al., “An algorithm to diagnose influenza infection: evaluating the clinical importance and impact on hospital costs of screening with rapid antigen detection tests,” European Journal of Clinical Microbiology & Infectious Diseases, vol. 34, no. 6, pp. 1081–1085, 2015.
  9. Y. M. Chong, X. H. Tan, P. S. Hooi, L. M. Lee, I. C. Sam, and Y. F. Chan, “Evaluation of rapid influenza diagnostic tests for influenza A and B in the tropics,” Journal of Medical Virology, vol. 91, no. 8, pp. 1562–1565, 2019.
  10. S. A. Harper, J. S. Bradley, J. A. Englund et al., “Seasonal influenza in adults and children--diagnosis, treatment, chemoprophylaxis, and institutional outbreak management: clinical practice guidelines of the Infectious Diseases Society of America,” Clinical Infectious Diseases, vol. 48, no. 8, pp. 1003–1032, 2009.
  11. C. Chartrand, M. M. G. Leeflang, J. Minion, T. Brewer, and M. Pai, “Accuracy of rapid influenza diagnostic tests,” Annals of Internal Medicine, vol. 156, no. 7, pp. 500–511, 2012.
  12. K. M. Koczula and A. Gallotta, “Lateral flow assays,” Essays in Biochemistry, vol. 60, no. 1, pp. 111–120, 2016.
  13. M. L. Boisen, D. Oottamasathien, A. B. Jones et al., “Development of prototype filovirus recombinant antigen immunoassays,” The Journal of Infectious Diseases, vol. 212, Supplement 2, pp. S359–S367, 2015.
  14. N. M. Rodriguez, J. C. Linnes, A. Fan, C. K. Ellenson, N. R. Pollock, and C. M. Klapperich, “Paper-based RNA extraction, in situ isothermal amplification, and lateral flow detection for low-cost, rapid diagnosis of influenza A (H1N1) from clinical specimens,” Analytical Chemistry, vol. 87, no. 15, pp. 7872–7879, 2015.
  14. N. M. Rodriguez, J. C. Linnes, A. Fan, C. K. Ellenson, N. R. Pollock, and C. M. Klapperich, “Paper-based RNA extraction, in situ isothermal amplification, and lateral flow detection for low-cost, rapid diagnosis of influenza a (H1N1) from clinical specimens,” Analytical Chemistry, vol. 87, no. 15, pp. 7872–7879, 2015. View at: Publisher Site | Google Scholar
  15. B. A. Rohrman, V. Leautaud, E. Molyneux, and R. R. Richards-Kortum, “A lateral flow assay for quantitative detection of amplified HIV-1 RNA,” PLoS One, vol. 7, no. 9, article e45611, 2012. View at: Publisher Site | Google Scholar
  16. H. Kamphee, A. Chaiprasert, T. Prammananan, N. Wiriyachaiporn, A. Kanchanatavee, and T. Dharakul, “Rapid molecular detection of multidrug-resistant tuberculosis by PCR-nucleic acid lateral flow immunoassay,” PLoS One, vol. 10, no. 9, article e0137791, 2015. View at: Publisher Site | Google Scholar
  17. J. T. Connelly, S. R. Nugen, W. Borejsza-Wysocki, R. A. Durst, R. A. Montagna, and A. J. Baeumner, “Human pathogenic Cryptosporidium species bioanalytical detection method with single oocyst detection capability,” Analytical and Bioanalytical Chemistry, vol. 391, no. 2, pp. 487–495, 2008. View at: Publisher Site | Google Scholar
  18. P. F. Whiting, A. W. Rutjes, M. E. Westwood et al., “QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies,” Annals of Internal Medicine, vol. 155, no. 8, pp. 529–536, 2011. View at: Publisher Site | Google Scholar
  19. K. A. Poehling, M. R. Griffin, R. S. Dittus et al., “Bedside diagnosis of influenzavirus infections in hospitalized children,” Pediatrics, vol. 110, no. 1, pp. 83–88, 2002. View at: Publisher Site | Google Scholar
  20. C. Quach, D. Newby, G. Daoust, E. Rubin, and J. McDonald, “QuickVue influenza test for rapid detection of influenza A and B viruses in a pediatric population,” Clinical and Diagnostic Laboratory Immunology, vol. 9, no. 4, pp. 925-926, 2002. View at: Publisher Site | Google Scholar
  21. A. C. Cazacu, J. Greer, M. Taherivand, and G. J. Demmler, “Comparison of lateral-flow immunoassay and enzyme immunoassay with viral culture for rapid detection of influenza virus in nasal wash specimens from children,” Journal of Clinical Microbiology, vol. 41, no. 5, pp. 2132–2134, 2003. View at: Publisher Site | Google Scholar
  22. F. Stripeli, Z. Sakkou, N. Papadopoulos et al., “Performance of rapid influenza testing in hospitalized children,” European Journal of Clinical Microbiology and Infectious Diseases, vol. 29, no. 6, pp. 683–688, 2010. View at: Publisher Site | Google Scholar
  23. P. Patel, E. Graser, S. Robst et al., “rapidSTRIPE H1N1 test for detection of the pandemic swine origin influenza A (H1N1) virus,” Journal of Clinical Microbiology, vol. 49, no. 4, pp. 1591–1593, 2011. View at: Publisher Site | Google Scholar
  24. W. S. Kim, G. C. Lee, J. H. Yoo, H. Y. Kim, Y. P. Yun, and C. K. Chong, “Development and diagnostic application/evaluation of pandemic (H1N1) 2009 influenza virus-specific monoclonal antibodies,” Microbiology and Immunology, vol. 56, no. 6, pp. 372–377, 2012. View at: Publisher Site | Google Scholar
  25. J. Sun, X. Lei, W. Wang et al., “Development and evaluation of a paramagnetic nanoparticle based immunochromatographic strip for specific detection of 2009 H1N1 influenza virus,” Journal of Nanoscience and Nanotechnology, vol. 13, no. 3, pp. 1684–1690, 2013. View at: Publisher Site | Google Scholar
  26. Y. Ge, B. Wu, X. Qi et al., “Rapid and sensitive detection of novel avian-origin influenza A (H7N9) virus by reverse transcription loop-mediated isothermal amplification combined with a lateral-flow device,” PLoS One, vol. 8, no. 8, article e69941, 2013. View at: Publisher Site | Google Scholar
  27. G. P. Leonardi, A. M. Wilson, and A. R. Zuretti, “Comparison of conventional lateral-flow assays and a new fluorescent immunoassay to detect influenza viruses,” Journal of Virological Methods, vol. 189, no. 2, pp. 379–382, 2013. View at: Publisher Site | Google Scholar
  28. R. Zazueta-Garcia, A. Canizalez-Roman, H. Flores-Villasenor, J. Martinez-Garcia, A. Llausas-Vargas, and N. Leon-Sicairos, “Effectiveness of two rapid influenza tests in comparison to reverse transcription-PCR for influenza A diagnosis,” Journal of Infection in Developing Countries, vol. 8, no. 3, pp. 331–338, 2014. View at: Publisher Site | Google Scholar
  29. A. Sakurai, K. Takayama, N. Nomura et al., “Fluorescent immunochromatography for rapid and sensitive typing of seasonal influenza viruses,” PLoS One, vol. 10, no. 2, article e0116715, 2015. View at: Publisher Site | Google Scholar
  30. S. Ma, X. Li, B. Peng et al., “Rapid detection of avian influenza A virus (H7N9) by lateral flow dipstick recombinase polymerase amplification,” Biological & Pharmaceutical Bulletin, vol. 41, no. 12, pp. 1804–1808, 2018. View at: Publisher Site | Google Scholar
  31. J. Zhang, X. Gui, Q. Zheng et al., “An HRP-labeled lateral flow immunoassay for rapid simultaneous detection and differentiation of influenza A and B viruses,” Journal of Medical Virology, vol. 91, no. 3, pp. 503–507, 2019. View at: Publisher Site | Google Scholar
  32. J. J. Deeks, P. Macaskill, and L. Irwig, “The performance of tests of publication bias and other sample size effects in systematic reviews of diagnostic test accuracy was assessed,” Journal of Clinical Epidemiology, vol. 58, no. 9, pp. 882–893, 2005. View at: Publisher Site | Google Scholar
  33. J. J. Deeks, J. P. Higgins, and D. J. Altman, Analysing data and undertaking meta-analyses, April 2020,
  34. Y. Z. Chen, L. C. Sun, Y. H. Wen et al., “Pooled analysis of the Xpert MTB/RIF assay for diagnosing tuberculous meningitis,” Bioscience Reports, vol. 40, no. 1, 2020. View at: Publisher Site | Google Scholar
  35. J. Zamora, V. Abraira, A. Muriel, K. Khan, and A. Coomarasamy, “Meta-DiSc: a software for meta-analysis of test accuracy data,” BMC Medical Research Methodology, vol. 6, no. 1, p. 31, 2006. View at: Publisher Site | Google Scholar
  36. H. Faden, “Comparison of midturbinate flocked-swab specimens with nasopharyngeal aspirates for detection of respiratory viruses in children by the direct fluorescent antibody technique,” Journal of Clinical Microbiology, vol. 48, no. 10, pp. 3742-3743, 2010. View at: Publisher Site | Google Scholar
  37. S. Ryba, J. Tacner, M. Havlickova, and P. Stopka, “Stability of influenza virus as evaluated by integrity of its RNA,” Acta Virologica, vol. 56, no. 2, pp. 125–128, 2012. View at: Publisher Site | Google Scholar
  38. L. Rainen, Collection, transport, preparation, and storage of specimens for molecular methods, 2005, August 2020,
  39. G. Boivin, I. Hardy, and A. Kress, “Evaluation of a rapid optical immunoassay for influenza viruses (FLU OIA test) in comparison with cell culture and reverse transcription-PCR,” Journal of Clinical Microbiology, vol. 39, no. 2, pp. 730–732, 2001. View at: Publisher Site | Google Scholar
  40. M. Landry, S. Cohen, and D. Ferguson, “Impact of sample type on rapid detection of influenza virus a by cytospin-enhanced immunofluorescence and membrane enzyme-linked immunosorbent assay,” Journal of Clinical Microbiology, vol. 38, no. 1, pp. 29–430, 2000. View at: Google Scholar
  41. P. L. Roa, B. Rodríguez-Sánchez, P. Catalán et al., “Diagnosis of influenza in intensive care units: lower respiratory tract samples are better than nose-throat swabs,” American Journal of Respiratory and Critical Care Medicine, vol. 186, no. 9, pp. 929-930, 2012. View at: Publisher Site | Google Scholar
  42. S. B. Lambert, D. M. Whiley, N. T. O'Neill et al., “Comparing nose-throat swabs and nasopharyngeal aspirates collected from children with symptoms for respiratory virus identification using real-time polymerase chain reaction,” Pediatrics, vol. 122, no. 3, pp. e615–e620, 2008. View at: Publisher Site | Google Scholar
  43. J. Singh, S. Sharma, and S. Nara, “Evaluation of gold nanoparticle based lateral flow assays for diagnosis of enterobacteriaceae members in food and water,” Food Chemistry, vol. 170, pp. 470–483, 2015. View at: Publisher Site | Google Scholar
  44. E. E. Walsh, D. R. Peterson, A. E. Kalkanoglu, F. E. H. Lee, and A. R. Falsey, “Viral shedding and immune responses to respiratory syncytial virus infection in older adults,” The Journal of Infectious Diseases, vol. 207, no. 9, pp. 1424–1432, 2013. View at: Publisher Site | Google Scholar
  45. F. Milano, A. P. Campbell, K. A. Guthrie et al., “Human rhinovirus and coronavirus detection among allogeneic hematopoietic stem cell transplantation recipients,” Blood, vol. 115, no. 10, pp. 2088–2094, 2010. View at: Publisher Site | Google Scholar

Copyright © 2020 Meng-Yi Han et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
