BioMed Research International

Special Issue: Teledentistry: Current Applications, Trends, Future Scope, and Problems

Review Article | Open Access


Naseer Ahmed, Maria Shakoor Abbasi, Filza Zuberi, Warisha Qamar, Mohamad Syahrizal Bin Halim, Afsheen Maqsood, Mohammad Khursheed Alam, "Artificial Intelligence Techniques: Analysis, Application, and Outcome in Dentistry—A Systematic Review", BioMed Research International, vol. 2021, Article ID 9751564, 15 pages, 2021.

Artificial Intelligence Techniques: Analysis, Application, and Outcome in Dentistry—A Systematic Review

Academic Editor: Vincenzo Grassia
Received: 06 May 2021
Revised: 30 May 2021
Accepted: 05 Jun 2021
Published: 23 Jun 2021


Objective. The objective of this systematic review was to investigate the quality and outcomes of studies on artificial intelligence techniques, their analysis, and their effect in dentistry. Materials and Methods. Using the MeSH keywords artificial intelligence (AI), dentistry, AI in dentistry, neural networks and dentistry, machine learning, AI dental imaging, and AI treatment recommendations and dentistry, two investigators performed an electronic search in 5 databases: PubMed/MEDLINE (National Library of Medicine), Scopus (Elsevier), ScienceDirect (Elsevier), Web of Science (Clarivate Analytics), and the Cochrane Collaboration (Wiley). English-language articles reporting on AI in different dental specialties were screened for eligibility. Thirty-two full-text articles were selected and systematically analyzed according to predefined inclusion criteria. These articles were analyzed in light of a specific research question, and relevant data on general article characteristics, study and control groups, assessment methods, outcomes, and quality assessment were extracted. Results. The initial search identified 175 articles related to AI in dentistry based on titles and abstracts. The full text of 38 articles was assessed for eligibility to exclude studies not fulfilling the inclusion criteria. Six articles not related to AI in dentistry were excluded. Thirty-two articles were included in the systematic review. It was revealed that AI provides accurate patient management, dental diagnosis, prediction, and decision making. Artificial intelligence appeared to be a reliable modality to enhance future applications in various fields of dentistry, i.e., diagnostic dentistry, patient management, head and neck cancer, restorative dentistry, prosthetic dental sciences, orthodontics, radiology, and periodontics. Conclusion.
The included studies indicate that AI is a reliable tool for making dental care smoother, better, time-saving, and more economical for practitioners, and it benefits them in fulfilling patient demands and expectations. Dentists can use AI to ensure quality treatment, better oral health care outcomes, and precision. AI can help predict failures in clinical scenarios and suggest reliable solutions. Although AI is expanding the scope of state-of-the-art models in dentistry, it is still under development. Further studies are required to assess the clinical performance of AI techniques in dentistry.

1. Introduction

Artificial intelligence (AI) is a general term that refers to machines and technology performing tasks normally carried out by human beings. According to Barr and Feigenbaum, AI is the part of computer science concerned with designing intelligent computer systems that exhibit the characteristics we associate with intelligence in human behavior: understanding language, learning, reasoning, problem solving, and many more [1]. AI has subcategories, such as machine learning and its allied fields, including deep learning, cognitive computing, natural language processing, robotics, expert systems, and fuzzy logic. Machine learning is a subgroup of AI that enables automated learning without being explicitly programmed; its primary goal is to allow systems to learn without human intervention. AI models predict future events from a present set of observations [2]. Schematic presentations of AI and a human intelligence model are shown in Figures 1 and 2.

AI, as in other fields, is emerging as a transformative force in dentistry. AI can perform a number of simple tasks in the dental clinic with greater precision, less staffing, and fewer errors than human counterparts; from booking and coordinating regular appointments to assisting in clinical diagnosis and treatment planning, AI can handle it all [3]. AI applications have shown high accuracy, sensitivity, specificity, and precision in the detection and classification of malocclusion in orthodontics [4]. AI can automatically detect and classify dental restorations on panoramic radiographs and assist in detecting dental and maxillofacial abnormalities such as periodontal diseases, root caries, bony lesions (e.g., bisphosphonate-related osteonecrosis of the jaw (BRONJ) associated with dental extraction), and facial defects [3, 5].

A popular field in machine learning is “deep learning,” in which multilayered (deep) neural networks are used to learn hierarchical features in the data. Deep learning refers to the process of data (e.g., images) and corresponding labels (e.g., “carious tooth” or “specific area on an image where a caries lesion is present”) being repeatedly passed through the neural network during training, with the model parameters (so-called weights) being iteratively adjusted to improve the model’s accuracy [1]. A deep learning-based convolutional neural network (CNN) algorithm performed considerably well in detecting dental caries in periapical radiographs [6]. It also successfully helped in detecting and classifying impacted supernumerary teeth in patients with fully erupted maxillary permanent incisors on panoramic radiographs [7]. The fully deep, fine-tuned mask R-CNN model performed well in automated tooth segmentation on panoramic images [8]. Additionally, it was also used for detecting apical lesions on panoramic radiographs [9].
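
To make the training process described above concrete, the following toy sketch uses a single logistic "neuron" rather than a real CNN, with synthetic feature vectors standing in for images: labeled examples are passed repeatedly through the model while its weights are iteratively adjusted to reduce error. The dataset, features, and learning rate are all illustrative assumptions, not taken from any of the reviewed studies.

```python
import math

def sigmoid(z):
    """Squash a logit into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic training set: 3 image-derived features per example,
# label 1 = "carious tooth", 0 = "sound tooth" (invented values).
data = [([0.9, 0.8, 0.7], 1), ([0.8, 0.9, 0.6], 1),
        ([0.1, 0.2, 0.1], 0), ([0.2, 0.1, 0.3], 0)]

w = [0.0, 0.0, 0.0]  # model parameters ("weights"), iteratively adjusted
b = 0.0
lr = 0.5             # learning rate (illustrative choice)

for _ in range(200):                 # data repeatedly passed through the model
    for x, y in data:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        err = p - y                  # gradient of the log-loss w.r.t. the logit
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

predictions = [round(sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b))
               for x, _ in data]     # → [1, 1, 0, 0], matching the labels
```

A real deep learning system differs only in scale: many stacked layers, convolutional filters instead of a flat weight vector, and automatic differentiation, but the same loop of prediction, error, and weight update.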

Recently, an investigation showed that artificial neural networks (ANNs) could act as a second opinion to locate the apical foramen on radiographs and to enhance the accuracy of radiographic working length determination [10]. In another in vitro study, an ANN also aided in relating shade, light-curing unit, and the bottom-to-top Vickers hardness ratio of composites [11]. AI technology has also been found useful in predicting the debonding probability of composite restorations in restorative dentistry [12].

Furthermore, an automated robotic system can fulfill the requirements of typical dental operations with accurate, safe, three-dimensional (3D) tooth preparation [13, 14]. Convolutional neural networks (CNNs) can be utilized for classifying dental arches and designing removable partial dentures [15]. AI can analyze the impact of orthognathic treatment on facial attractiveness and apparent age, offering a new feature that permits scoring facial attractiveness and apparent age objectively and reproducibly [16]. Automated integration of facial and intraoral images of anterior teeth helps dentists analyze the shape and position of the maxillary anterior teeth [17].

In a nutshell, the last decade has seen a surge of breakthroughs in technology associated with artificial intelligence. However, it is still uncertain how the information available in the literature on AI can assist in the diagnosis, planning, and management of dental diseases. Therefore, to understand the current trends of AI in dentistry and its applications, a systematic review was carried out on studies that have discussed different modalities of artificial intelligence, their application, and their outcomes in dentistry.

2. Materials and Methods

2.1. Focused Question

This systematic review was conducted following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Our focused question was “Which artificial intelligence techniques are practiced in dentistry, and how does AI improve the diagnosis, clinical decision making, and outcome of dental treatment?” The question was constructed according to the Participants, Intervention, Comparison, Outcome, and Study design (PICOS) strategy [18].

Population: patient/simulator faciodental images (two-dimensional (2D) and three-dimensional (3D) images; radiographs (periapical, bitewing, orthopantomography, and cone-beam computed tomography)), CAD/CAM (computer-aided design and computer-aided manufacturing) data, and virtual dental models.

Intervention: AI techniques (deep learning, natural language processing, and robotics) applied in diagnosis, management, and predicting prognosis of dental treatment.

Comparison: automatic algorithm, testing models, image analysis, and rater opinions.

Outcome: analysis of AI performance, accuracy/precision, sensitivity, rating, clinical decision support (CDS), area under the curve (AUC), and AI applicability in different dental specialties.

Study design type: for this review, we considered both observational (case-control and cohort) and interventional (trial-based) studies published in the English language.

2.2. Eligibility Criteria

Articles were reviewed against the following inclusion criteria: (1) original articles relevant to AI in dentistry, (2) clinical trials, (3) nonclinical trials, (4) observational studies, and (5) English-language articles. Review articles, letters to the editor, commentaries, grey literature, case reports, and articles with fewer than 10 participants or specimens were excluded.
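
The eligibility rules above can be expressed as a simple filter. This is an illustrative sketch only: the record fields and example entries are invented for demonstration and do not come from the review's actual screening data.

```python
# Hypothetical bibliographic records screened against the review's criteria.
INCLUDED_TYPES = {"original article", "clinical trial",
                  "nonclinical trial", "observational study"}

def eligible(record):
    """Apply inclusion criteria (1)-(5) and the sample-size exclusion rule."""
    return (record["language"] == "English"
            and record["type"] in INCLUDED_TYPES
            and record.get("participants", 0) >= 10)

records = [
    {"title": "CNN caries detection", "language": "English",
     "type": "clinical trial", "participants": 200},
    {"title": "Editorial on AI", "language": "English",
     "type": "letter to editor", "participants": 0},     # excluded type
    {"title": "Small pilot", "language": "English",
     "type": "observational study", "participants": 8},  # fewer than 10
]
kept = [r["title"] for r in records if eligible(r)]  # → ["CNN caries detection"]
```

Excluded publication types (reviews, letters, commentaries, grey literature, case reports) simply never appear in the included set, so the filter needs only a whitelist.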

2.3. Search Methodology

The medical subject heading (MeSH) terms were artificial intelligence (AI), dentistry, AI in dentistry, neural networks and dentistry, machine learning, AI dental imaging, and AI treatment recommendations; an electronic search was carried out in the PubMed/MEDLINE, ScienceDirect, Scopus, Web of Science, and Cochrane Collaboration databases. Articles published between 2000 and 2020 were targeted. Data extraction took between 10 and 12 weeks, and the last search was performed in January 2020. Two calibrated reviewers (N.A. and W.Q.) performed the search. Disagreements and discrepancies were resolved by consensus, and a third examiner (F.Z.) was consulted where needed. All titles and abstracts from the primary search were read thoroughly, and nonrelevant studies were excluded. The relevant articles were listed and scrutinized for similar studies matching our inclusion criteria. For extraction of pertinent results, we read the full texts of the included studies and recorded the findings.
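
As a purely hypothetical illustration, the MeSH terms listed above could be composed into a single boolean query string; actual query syntax varies by database, so this is not the review's literal search string.

```python
# The MeSH terms from the search methodology, joined with OR
# (illustrative only; each database has its own query syntax).
terms = [
    "artificial intelligence", "dentistry", "AI in dentistry",
    "neural networks and dentistry", "machine learning",
    "AI dental imaging", "AI treatment recommendations",
]
query = " OR ".join(f'"{t}"' for t in terms)
# query begins: "artificial intelligence" OR "dentistry" OR ...
```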

2.4. Quality Assessment of Included Studies

Quality assessment of the included articles was carried out according to the standard parameters described in the Cochrane Handbook for Systematic Reviews of Interventions (v5.1.0) [19]. The parameters were patient randomization, blinding procedure, withdrawal/dropout reported, statistical analysis used and stated clearly, sample size estimation performed, multiple variables measured, clear inclusion and exclusion criteria, examiner reliability tested, and all expected outcomes clearly reported. The quality of each study was then classified as having a low, medium, or high risk of bias. The same 2 review authors independently sorted the search results to maximize the number of studies retrieved. The reviewers assessed every selected article against the predefined inclusion criteria and conducted impartial appraisals, and any ambiguity was settled by discussion and agreement or by consultation with a third reviewer (F.Z.).

The Newcastle-Ottawa quality assessment scale (NOS) for case-control studies [20] was used for further analysis of the included articles. The analysis was based on three core quality parameters: case and group (selection, definition, and representativeness), comparability (comparison of case and control groups; analysis and control of confounding variables), and exposure (outcome assessment, i.e., analysis of outcome estimation in patients by different examiners; evaluation of study outcomes related to different clinical measurements; use of a universal assessment method for both control and case groups; dropout rate of patients in the included studies). A star system was adopted for rating the included studies. Each item in the selection and outcome categories received a maximum of one star, while up to two stars were assigned for comparability if sufficiently reported. Each study's total score ranged from 1 to 8 stars. Owing to the heterogeneity of outcomes and variables in the selected studies, the research team was unable to conduct a meta-analysis in the current review.
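
The star tally described above can be sketched as a small function: each selection or exposure item earns at most one star, and comparability earns at most two. The function name and example inputs are ours, for illustration only.

```python
def nos_total(selection_items, comparability_stars, exposure_items):
    """Sum NOS stars: 1 star max per selection/exposure item, 2 max for comparability."""
    selection = sum(1 for reported in selection_items if reported)
    comparability = min(comparability_stars, 2)
    exposure = sum(1 for reported in exposure_items if reported)
    return selection + comparability + exposure

# Hypothetical study: three selection items met, full comparability,
# two of three exposure items met.
score = nos_total([True, True, True], 2, [True, True, False])  # → 7
```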

3. Results

3.1. Search Results

The primary search identified 175 articles based on the key terms. Of these, 41 duplicates were removed, and 134 articles were screened on titles and abstracts. The search was further narrowed down, and 96 irrelevant articles were excluded. The remaining 38 full-text articles were assessed for eligibility, and 6 further full-text articles were excluded. The 32 relevant articles were finally included and analyzed in the review. The PRISMA flow diagram for the literature search strategy is shown in Figure 3. The excluded studies, together with the reasons for their exclusion, are listed in Table 1.
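
The screening arithmetic reported above can be checked step by step:

```python
# PRISMA flow counts from the search results, verified stage by stage.
identified = 175
screened = identified - 41   # duplicates removed → 134 title/abstract records
full_text = screened - 96    # irrelevant records excluded → 38 full texts
included = full_text - 6     # full-text exclusions → 32 included studies
```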

Author and year | Reason for exclusion

Gould [21] 2002 | Disagreement between authors
Van der Meer et al. [22] 2016 | Not AI; related to 3D printing guides
Vera et al. [23] 2013 | AI related to dental biotechnology
Leeson [24] 2020 | Disagreement between authors
Rekow [25] 2020 | Not AI; related to digital dentistry
McCracken et al. [26] 2000 | Not AI; related to a computer-assisted learning program

AI: artificial intelligence.
3.2. General Characteristics of Included Studies

The general characteristics of the included studies are summarized in Table 2. The following data were extracted from the articles: authors and year of publication, study design, study and control groups, area of application in dentistry, assessment methods, follow-up period, and study outcome.

Author and year | Study design | Groups (study; control) | Application | Assessment method | Follow-up period | Outcome

Abdalla-Aslan et al. [5] 2020 | Cohort study | Machine learning computer vision algorithms; NA | OD | An automatic algorithm was used to detect and classify restorations, while a support vector machine algorithm with error-correcting output codes was applied for cross-validation | NA | Machine learning demonstrated excellent performance in detecting and classifying dental restorations on panoramic images
Bouchahma et al. [6] 2019 | Clinical trial | CNN; NM | OD and endodontics | Prediction of three types of treatment (fluoride, filling, and root canal treatment); the model was trained on a collected dataset of 200 X-ray images of patients' teeth | NM | DL overall accuracy was 87%; the best prediction was fluoride treatment (98%), followed by RCT detection (88%) and filling (77%)
Kuwada et al. [7] 2020 | Clinical trial | DetectNet, AlexNet, and VGG-16; NM | OD | 400 images were randomly selected as training data and 100 as validating and testing data; the remaining 50 images were used as new testing data; recall, precision, and F-measure were used for detection of impacted teeth | NM | DetectNet and AlexNet appear to have potential use in classifying the presence of impacted supernumerary teeth in the maxillary incisor region on PR, while VGG-16 showed lower values
Lee et al. [8] 2020 | Clinical trial | CNN on 20 automated tooth segments; an oral radiologist manually performed individual tooth annotation on the PA | OD and forensic dentistry | 846 images with tooth annotations from 30 PA were used for training and 20 as the validation and test sets; a fully deep learning method using the mask R-CNN model was implemented through a fine-tuning process to detect and localize tooth structures | NM | High performance was achieved for automated tooth segmentation on dental panoramic images; the proposed method might be applied in the first step of diagnosis automation and in forensic identification
Ekert et al. [9] 2019 | Clinical trial | CNN to detect ALs; six independent examiners detecting ALs | Endodontics and OD | The NN was trained and validated via 10-times repeated group shuffling; results were compared with the majority vote of 6 examiners who detected ALs on an ordinal scale | NM | A moderately deep CNN showed satisfying discriminatory ability to detect ALs on panoramic radiographs
Saghiri et al. [10] 2012 | Clinical trial | ANN; endodontists' opinion | Endodontics | Working length was determined and confirmed radiographically by endodontists and compared with the ANN, with a stereomicroscope as the gold standard after tooth extraction in cadavers | NM | The ANN was more accurate than the endodontists' determinations when compared with stereomicroscope measurements
Arisu et al. [11] 2018 | Clinical trial | ANN; NM | Restorative dentistry | Obtained measurements and data were fed to an ANN to establish the correlation between the inputs (composite shade, curing units) and outputs (tooth number) | NM | The ANN showed that the light-curing unit and composite parameters had the most significant effect on the bottom-to-top Vickers hardness ratio of the composites
Yamaguchi et al. [12] 2019 | Clinical trial | 12 dislodged CAD/CAM composite resin crowns with DL; 12 trouble-free CAD/CAM composite resin crowns | Restorative dentistry | A convolutional neural network (CNN) technique was used to predict debonding of composite crowns using 2D images captured from 3D stereolithography models | NM | The deep learning CNN model showed good performance in predicting dislodgement of composite crowns through 3D stereolithography models
Otani et al. [13] 2015 | Experimental study | Ten veneer preparations with a robotic arm; ten conventional veneers prepared by a clinician | Restorative dentistry | Accuracy and precision of veneer preparation were compared for all sites and separately for each tooth surface (facial, finish line, incisal) through 3D images and computation | NM | The robotic arm was able to prepare the tooth model as accurately as the control; moreover, better finish line accuracy and precision were shown by the robotic arm
Wang et al. [14] 2014 | Experimental study | Automatic laser ablation system for tooth crown preparation; NM | Prosthodontics | A layer-by-layer ablation method was developed to control the laser focus during crown preparation | NM | The movement range and resolution of the robotic system satisfy the requirements of typical dental operations for clinical crown preparation
Takahashi et al. [15] 2020 | Experimental study | CNN; NM | Prosthodontics and OD | 1184 images of dental arches were classified into four arch types; a CNN method to classify the images was developed using the TensorFlow and Keras deep learning libraries | NM | The results suggest that dental arches can be classified and predicted using a CNN
Patcas et al. [16] 2019 | Cohort study | CNN applied to posttreatment photographs of 146 orthognathic patients; pretreatment photographs of the 146 patients | Orthodontics | A CNN-based technique was used to compare facial attractiveness and apparent age of patients through pre- and posttreatment photographs | NA | Artificial intelligence can be used to determine facial attractiveness scores and apparent age in orthognathic surgery patients
Li et al. [17] 2020 | Clinical trial | 50 oral images and 274 anterior teeth through an automated photo-integrating system; manual segmentation system | Esthetic dentistry | Facial and intraoral key points were detected by an automatic algorithm and compared with manual segmentation on standard photographs | NM | The proposed automated system can eliminate the laborious image integration process for dentists and has potential for broad applicability in esthetic dentistry
Li et al. [44] 2015 | Experimental study | BPNN and GA neural network; traditional neural network | Esthetic dentistry | The weights and threshold values of the GA and BPNN were compared for assistance in tooth color matching in dentistry | NM | GA and BPNN have practical application and can make tooth color matching objective and accurate
Edinger [30] 2004 | Clinical trial | ROSY, a robot-like electronic simulator; NM | Prosthodontics | Accuracy of the simulator was measured for all directions in space by registering eccentric jaw positions on both sides in 10 subjects | NM | Its accuracy may render it suitable for clinical applications
Meissner et al. [31] 2006 | Clinical trial | Automated smart ultrasonic calculus detection system; NM | Periodontics | The detection device is based on a conventional dental piezoelectric ultrasonic handpiece with a conventional scaler insert | NM | It was able to distinguish between different tooth surfaces in vitro independently of tip movements
Meissner et al. [32] 2005 | Clinical trial | A novel calculus recognition device applied to 70 extracted teeth; NM | Periodontics | An impulse generator, coupled to a conventional piezo-driven ultrasonic scaler, sends signals to the cementum via the tip of an ultrasound device | NM | The system is able to function correctly, independent of the lateral forces and the tip angle of the instrument
Devito et al. [33] 2008 | Clinical trial | Multilayer perceptron neural network; twenty-five dental specialists with 20 years' experience | OD | Evaluation of proximal caries on radiographs through an ANN | NM | AI improved the radiographic diagnosis of proximal caries by 39.4%
Kositbowornchai et al. [34] 2006 | Clinical trial | Learning vector quantization (LVQ) neural network; NM | Restorative dentistry and OD | Tooth sections and microscopic examinations were used to confirm the actual dental caries status | NM | AI plays a useful, supporting role in dental caries diagnosis
Patcas et al. [35] 2019 | Clinical trial | Ten images evaluated by a CNN model; ten images analyzed by laypeople, orthodontists, and oral surgeons on a visual analogue scale | Orthodontics | Decisions on profile and frontal images of cleft patients were compared between the CNN technique and a conventional rater group to evaluate facial attractiveness | NM | AI can be a helpful tool to describe facial attractiveness, and overall analyses were comparable with the rater groups
Lee et al. [36] 2018 | Clinical trial | CNN; four calibrated board-certified dentists | OD and restorative dentistry | A pretrained GoogLeNet Inception v3 CNN was used for preprocessing and transfer learning | NM | The CNN provides considerably good performance in detecting dental caries in PR
Vranckx et al. [37] 2020 | Clinical trial | CNN and ResNet-101; manual measurements by 2 observers | OD | The CNN and ResNet-101 jointly predicted the molar segmentation maps and an estimate of the orientation lines | NM | Fast, accurate, and consistent automated measurement of molar angulations on dental PR
Lee et al. [38] 2020 | Clinical trial | Fifty cases of class 2 TMJOA; fifty cases of normal TMJ | OMFS | The condylar head was classified into 2 categories and tested on 300 images | NM | AI can be used to support clinicians in diagnosis and treatment decision making for TMJOA
Hung et al. [39] 2019 | Clinical study | Machine learning (support vector machine) applied to bitewing radiographs; training group consisting of conventional radiograph analysis | Gerodontology | A support vector machine was used to detect root caries on radiographs by determining the AUC | NM | The support vector machine showed 97.1% accuracy, 95.1% precision, 99.6% sensitivity, and 94.3% specificity for root caries detection
Cui et al. [27] 2020 | Cohort study | CDS model applied to 3559 patient records; two prosthodontists' opinions | OMFS | The CDS model was used to predict the outcome of tooth extraction through electronic dental records | NA | The machine learning CDS was an efficient tool to predict tooth extraction outcomes
Sornam and Prabhakaran [40] 2019 | Clinical study | LB-ABC with BPNN; BPNN classifier | Restorative dentistry | The BPNN classifier was compared with the LB-ABC-based BPNN classifier for dental caries classification | NM | The learning rate generated by the LB-ABC for the BPNN classifier achieved the best training and testing accuracy of 99.16%
Setzer et al. [41] 2020 | Clinical study | Evaluation of periapical lesions by a DL method; rating by an OMF radiologist, an endodontist, and a senior graduate student | Endodontics | CBCT segmentation was assessed by DL CNN detection | NM | A DL algorithm trained in a limited CBCT environment showed excellent lesion detection accuracy
Cantu et al. [42] 2020 | Clinical study | Caries detection on bitewing radiographs with DL; opinion of four experienced dentists | OD, OR | A CNN (U-Net) and intersection-over-union were used to detect caries on radiographs | NM | The deep neural network was more accurate than the dentists
Aliaga et al. [45] 2020 | Experimental study | Automatic computation and intelligent image segmentation of 370 radiographs; expert dentist opinion | OD, OMFS | Automatic computation for analysis of mandibular indices and osteoporosis detection | NM | Automatic computation of mandibular indices with intelligent image segmentation was an efficient and reliable approach for early osteoporosis detection
Kim et al. [28] 2018 | Case-control study | Machine learning prediction models for BRONJ after extraction in 125 patients with drug use; conventional methods (serum CTX level) | OMFS/OM | Five machine learning methods (logistic regression, decision tree, support vector machine, ANN, and random forest) were applied to predict BRONJ at extraction sites | NA | Machine learning showed superior performance in predicting BRONJ compared with the serum CTX level and drug holiday period
Dumast et al. [29] 2018 | Case-control study | 17 tested OA subjects evaluated with a deep CNN on 3D images; 17 age- and sex-matched control subjects without OA | OMFS | A deep neural network classifier of 3D condylar morphology (ShapeVariationAnalyzer, SVA) and a flexible web-based system for data storage, computation, and integration (DSCI) of high-dimensional imaging, clinical, and biological data | NA | The deep neural network is a useful tool for classification of TMJOA
Sorkhabi and Khajeh [43] 2019 | Clinical trial | 3D deep CNN and CBCT; postextraction clinical parameter measurements | OD and implant dentistry | A 3D CNN method was used to measure alveolar bone density on CBCT images | 6 months | The 3D deep CNN technique can accurately classify alveolar bone pattern, which is helpful in dental implant placement and diagnosis

NA: not applicable; NM: not mentioned; OMFS: oral and maxillofacial surgery; OM: oral medicine; OP: oral pathology; OR: oral radiology; OD: oral diagnosis; AL: apical lesion; CNN: convolutional neural networks; ANN: artificial neural networks; 3D: three dimensional; DL: deep learning; CAL: computer-assisted learning; CAD/CAM: computer-aided design/computer-aided manufacturing; 2D: two dimensional; TMJOA: temporomandibular joint osteoarthritis; OA: osteoarthritis; BPNN: back-propagation neural networks; CDS: clinical decision support systems; BRONJ: bisphosphonate-related osteonecrosis of the jaw; LB-ABC: logit-based artificial bee colony optimization algorithm; VGG-16: Visual Geometry Group; PA: periapical radiograph; CBCT: cone-beam computerized tomography; GA: genetic algorithm; serum CTX: serum C-terminal telopeptide; AUC: area under the curve.

The included studies ranged from the year 2000 to 2020 and fell into four categories: cohort studies [5, 16, 27], case-control studies [28, 29], clinical trials [6-12, 17, 30-43], and experimental trials [13-15, 44, 45]. The follow-up period was mentioned in one study [43]. The various techniques of artificial intelligence were applied in the fields of oral and maxillofacial surgery [27-29, 38, 45], oral medicine [28], oral radiology [42], esthetic dentistry [17, 44], restorative dentistry [11-13, 34, 36, 40], endodontics [6, 9, 10, 41], oral diagnosis [5-9, 15, 33, 34, 36, 37, 42, 43, 45], orthodontics and orthognathic surgery [16, 35], forensic dentistry [8], gerodontology [39], implantology [43], periodontics [31, 32], and prosthodontics [14, 15, 30].

3.3. General Outcomes of Included Studies

The different modalities of artificial intelligence showed favorable outcomes. Deep learning with CNNs performed well in predicting the debonding probability of CAD/CAM crowns from 3D models [12], and it functioned considerably well in detecting apical lesions and dental caries in periapical (PA) and panoramic radiography [9, 36, 40, 42]. In addition, it proved accurate in predicting the treatment of dental decay based on radiographic images [6].

AI has also proved to assist dentists in implant treatment, from diagnosis to surgery, through proficient and reliable radiological evaluation [43]. Further, AI aided in the detection and classification of impacted supernumerary teeth in the maxillary incisor region on panoramic radiographs [7]. With AI, automation of tooth segmentation can also be achieved on dental panoramic images [8].

CNNs have been used to classify dental arches [15], and a multilayer perceptron neural network improved the radiographic diagnosis of proximal caries [33]. A machine learning computer vision algorithm likewise facilitates detecting and classifying dental restorations on panoramic images [5]. ANNs have been used to determine accurate working length on radiographs [10]. Likewise, neural network and web-based systems were able to assist in the characterization of temporomandibular joint (TMJ) health and temporomandibular joint osteoarthritis (TMJOA) at the clinical, imaging, and biological levels [29, 30, 38]. Furthermore, the computer color matching (CCM) technique provides accurate color matching of dental restorations, together with the automatic laser ablation system for clinical crown preparation [14, 44]. Overall, the above methods, if introduced into routine practice, can be helpful in diagnosing and treating dental diseases.

3.4. Results of Quality Assessment

According to the standards described in the Cochrane Handbook for Systematic Reviews of Interventions (v5.1.0) [19], the following findings were recorded. Of the 32 studies [5-17, 27-45] assessed, 1 study employed blinding [6]. Randomization was performed in 5 studies [5, 7, 34, 37, 39]. The dropout rate was mentioned in 31 studies [5-11, 13-17, 27-45]. The study variables were analyzed for accuracy in 30 studies [5-13, 15-17, 27-37, 39-45]. Sample size was mentioned in 31 studies [5-17, 27-33, 35-45]. The inclusion and exclusion criteria were clearly mentioned in 30 studies [5-15, 17, 27-37, 39-45]. Examiner reliability was tested in 30 studies [5-13, 15-17, 27-31, 33-45]. Additionally, the study outcome was prespecified in 28 studies [5-10, 13-17, 27-33, 35-37, 39-45]. The quality of 25 studies was rated as low risk of bias [5-10, 13, 15, 17, 27-29, 31-33, 35-37, 39-45], whereas 7 studies were rated as having a moderate risk of bias [11, 12, 14, 16, 30, 34, 38]. The quality assessment of the included studies is shown in Table 3.

Author and year | Randomization | Blinding | Withdrawal/dropout mentioned | Variables measured many times | Sample size estimation | Inclusion/exclusion criteria clear | Examiner reliability tested | Expected outcomes prespecified | Quality of study/bias risk

Abdalla-Aslan et al. [5] 2020 | Yes | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Bouchahma et al. [6] 2019 | No | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Low
Kuwada et al. [7] 2020 | Yes | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Lee et al. [8] 2020 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Ekert et al. [9] 2019 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Saghiri et al. [10] 2012 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Arisu et al. [11] 2018 | No | No | Yes | Yes | Yes | Yes | Yes | No | Moderate
Yamaguchi et al. [12] 2019 | No | No | Unclear | Yes | Yes | Yes | Yes | No | Moderate
Otani et al. [13] 2015 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Wang et al. [14] 2014 | Unclear | Unclear | Yes | No | Yes | Yes | No | Yes | Moderate
Takahashi et al. [15] 2020 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Patcas et al. [16] 2019 | No | No | Yes | Yes | Yes | Unclear | Yes | Yes | Moderate
Li et al. [17] 2020 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Li et al. [44] 2015 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Edinger [30] 2004 | Unclear | Unclear | Yes | Yes | Yes | Yes | No | Yes | Moderate
Meissner et al. [31] 2006 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Meissner et al. [32] 2005 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Devito et al. [33] 2008 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Kositbowornchai et al. [34] 2006 | Yes | No | Yes | Yes | No | Yes | Yes | No | Moderate
Patcas et al. [35] 2019 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Lee et al. [36] 2018 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Vranckx et al. [37] 2020 | Yes | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Lee et al. [38] 2020 | No | No | Yes | No | Yes | No | Yes | No | Moderate
Hung et al. [39] 2019 | Yes | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Cui et al. [27] 2020 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Sornam and Prabhakaran [40] 2019 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Setzer et al. [41] 2020 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Cantu et al. [42] 2020 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Aliaga et al. [45] 2020 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Kim et al. [28] 2018 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Dumast et al. [29] 2018 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low
Sorkhabi and Khajeh [43] 2019 | No | No | Yes | Yes | Yes | Yes | Yes | Yes | Low

A study was graded as having a low risk of bias if it yielded 6 or more “yes” answers to the 9 questions, a moderate risk if it yielded 3 to 5 “yes” answers, and a high risk if it yielded 2 “yes” answers or fewer.
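The grading rule above is a simple threshold on the count of “yes” answers. As an illustrative sketch only (not part of the review’s methodology; the example row below is hypothetical), it can be expressed as:

```python
def cochrane_risk_of_bias(answers):
    """Grade a study from its yes/no/unclear quality answers.

    Per the thresholds stated in the review: 6 or more "yes" -> low risk,
    3 to 5 -> moderate risk, 2 or fewer -> high risk.
    "No" and "Unclear" both count as non-"yes".
    """
    yes_count = sum(1 for a in answers if a.strip().lower() == "yes")
    if yes_count >= 6:
        return "low"
    if yes_count >= 3:
        return "moderate"
    return "high"

# Hypothetical study record mirroring one table row, in column order:
# randomization, blinding, withdrawal/dropout, variables measured,
# sample size, inclusion/exclusion, examiner reliability, outcomes prespecified
row = ["Yes", "No", "Yes", "Yes", "Yes", "Yes", "Yes", "Yes"]
print(cochrane_risk_of_bias(row))  # low
```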

Furthermore, the quality assessment of the selected studies on the NOS [20] ranged from 4 to 8 stars, with a mean score of 7 for the included studies, as shown in Table 4. Thirty-one studies fell into the moderate-risk category [5–17, 27–29, 31–45], while 1 study had a high risk of bias [30].

Author and year | Selection | Comparability | Exposure | Newcastle-Ottawa quality (total)

Abdalla-Aslan et al. [5] 2020 |  |  |  | 7
Bouchahma et al. [6] 2019 |  |  |  | 7
Kuwada et al. [7] 2020 |  |  |  | 7
Lee et al. [8] 2020 |  |  |  | 6
Ekert et al. [9] 2019 |  |  |  | 8
Saghiri et al. [10] 2012 |  |  |  | 7
Arisu et al. [11] 2018 |  |  |  | 6
Yamaguchi et al. [12] 2019 |  |  |  | 8
Otani et al. [13] 2015 |  |  |  | 7
Wang et al. [14] 2014 |  |  |  | 6
Takahashi et al. [15] 2020 |  |  |  | 7
Patcas et al. [16] 2019 |  |  |  | 6
Li et al. [17] 2020 |  |  |  | 8
Li et al. [44] 2015 |  |  |  | 8
Edinger [30] 2004 |  |  |  | 4
Meissner et al. [31] 2006 |  |  |  | 6
Meissner et al. [32] 2005 |  |  |  | 6
Devito et al. [33] 2008 |  |  |  | 7
Kositbowornchai et al. [34] 2006 |  |  |  | 6
Patcas et al. [35] 2019 |  |  |  | 7
Lee et al. [36] 2018 |  |  |  | 6
Vranckx et al. [37] 2020 |  |  |  | 7
Lee et al. [38] 2020 |  |  |  | 8
Hung et al. [39] 2019 |  |  |  | 7
Cui et al. [27] 2020 |  |  |  | 6
Sornam and Prabhakaran [40] 2019 |  |  |  | 7
Setzer et al. [41] 2020 |  |  |  | 7
Cantu et al. [42] 2020 |  |  |  | 8
Aliaga et al. [45] 2020 |  |  |  | 6
Kim et al. [28] 2018 |  |  |  | 7
Dumast et al. [29] 2018 |  |  |  | 7
Sorkhabi and Khajeh [43] 2019 |  |  |  | 7

A study can be awarded a maximum of 1 star for each numbered item within the selection and exposure categories and a maximum of 2 stars for comparability, for a total of up to 9 stars. A study was rated as having a low risk of bias if it received the maximum of 9 stars, a moderate risk if it received 6 to 8 stars, and a high risk if it received 5 stars or fewer.
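Like the Cochrane grading, the NOS rule above is a threshold on a single total. A minimal sketch of this mapping (illustrative only, assuming totals in the 0–9 star range the scale defines):

```python
def nos_risk_of_bias(stars):
    """Map a Newcastle-Ottawa total (0-9 stars) to a bias-risk category,
    per the thresholds stated above: 9 -> low, 6-8 -> moderate, <=5 -> high."""
    if not 0 <= stars <= 9:
        raise ValueError("NOS totals range from 0 to 9 stars")
    if stars == 9:
        return "low"
    if stars >= 6:
        return "moderate"
    return "high"

# The review's totals ranged from 4 to 8 stars, so under this rule every
# included study maps to moderate or high risk:
print(nos_risk_of_bias(7))  # moderate
print(nos_risk_of_bias(4))  # high
```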

4. Discussion

AI digital systems have unquestionably changed the direction of dentistry [46]. The AI modalities of machine learning, deep learning, cognitive computing, computer vision (recognizing the content of photos and videos), and natural language processing (analyzing and generating human speech by machine) are promising and already practiced in dentistry [47]. With the advent of AI, better restoration options are available, with longer service life and superior esthetics and function [12, 13]. AI models are bringing greater efficiency and accuracy, capitalizing on the interest, capabilities, and skills of those involved [48]. Effective and efficient interprofessional and clinician-patient interactions have evolved through these tools; with AI, students have new ways of learning through research, and the collected data can be efficiently utilized for forensic and epidemiological purposes [49, 50]. Extensive research has been carried out around the globe on the application and benefits of AI and its comparison with human skills. The purpose of this systematic review was to investigate the quality and outcome of studies into artificial intelligence techniques, analysis, and their effect in dentistry.

Among the studies reviewed, it was revealed that the applications of artificial intelligence in dentistry are ample. Studies have found that AI in practice will facilitate dentists at every step. For instance, a neural network is beneficial in screening for oral cancer and precancerous conditions, diagnosing bisphosphonate-related osteonecrosis before surgical removal of teeth, and evaluating cervical lymph node metastasis of carcinoma in comparison with magnetic resonance imaging [21, 28]. Furthermore, the computer color matching (CCM) technique provides accurate color matching of dental restorations, and an automatic laser ablation system supports clinical crown preparation [14, 44]. The methodology varied among the studies in how the data were collected and analyzed and how the AI techniques were developed; therefore, a direct comparison of the studies was difficult.

The AI models suggested a positive impact in assisting dental diagnostics; they can therefore help dentists achieve correct interpretations of dental anomalies and minimize human error. This review suggests that computer-based neural networks play a supporting role for dental practitioners in decision making and in minimizing errors during the execution of dental treatment planning.

Furthermore, the current review proposes AI as a reliable technology for appraising the depth of dental caries, diagnosing apical lesions, determining working length, classifying dental arches, segmenting teeth, detecting TMJ osteoarthritis, and detecting early osteoporosis in the jaws on panoramic radiographs [6, 8–10, 15, 36, 38, 45]. Rekow described a machine-learning algorithm to detect and classify dental restorations on panoramic images [25]. Kuwada et al. revealed that “DetectNet and AlexNet” appeared potentially useful for classifying the presence of impacted supernumerary teeth in the maxillary incisor region on panoramic radiographs [7]. Drevenstedt et al. used voice commands for recording patients’ histories and data, making suggestions during an ongoing dental procedure, scheduling patients’ appointments, and issuing reminders for routine checkups and necessary dental consultations [51]. Artificial neural network (ANN) models using bitewing radiographs showed 97.1% accuracy for dental caries diagnosis, 95.1% precision, a specificity of 94.3%, and a sensitivity ranging from 85% to 99.6% [42]. Sornam and Prabhakaran reported an accuracy ranging from 85% to 100% using the AI model “back-propagation neural network” (BPNN) in dental caries classification [40]. However, comparisons among the studies were difficult because of differences in the methods used.
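The diagnostic figures quoted above (accuracy, precision, specificity, sensitivity) all derive from the four cells of a confusion matrix. As a hedged illustration of how such figures are computed (the counts below are invented for the example and are not taken from any reviewed study):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Compute the standard screening metrics from confusion-matrix counts:
    true positives (tp), false positives (fp), true negatives (tn),
    and false negatives (fn)."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "precision": tp / (tp + fp),      # positive predictive value
        "sensitivity": tp / (tp + fn),    # true-positive rate (recall)
        "specificity": tn / (tn + fp),    # true-negative rate
    }

# Hypothetical counts for a caries classifier evaluated on 200 radiographs:
m = diagnostic_metrics(tp=90, fp=5, tn=95, fn=10)
print({k: round(v, 3) for k, v in m.items()})
# {'accuracy': 0.925, 'precision': 0.947, 'sensitivity': 0.9, 'specificity': 0.95}
```

Because each metric depends on a different slice of the matrix, a model can report high accuracy while sensitivity or specificity remains low, which is one reason the reviewed studies quote all four.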

Although the outcomes of the reviewed studies are auspicious, this review has a few limitations. For example, the quality assessment of the literature conceded that there is a possibility of bias. The complexity of each system or mechanism, its cost, and the equipment required for each setup need to be considered, as does the training required for each AI model. Further research, exposure, and implementation are required. The most worthwhile outcomes have not yet been achieved owing to the unavailability of accurate and sufficient data. In short, challenges exist in both technical and ethical aspects.

Nonetheless, in the future, AI-based comprehensive care systems will analyze big data, including faciodental images and other records. AI models will provide reliable information and refine the clinical decision-making process. In effect, AI is expected to establish high-quality patient care, innovative research, and state-of-the-art development in dentistry. Artificial intelligence and machine learning will aid the automation of aesthetic evaluation, smile design, and oral rehabilitation. Change is not easy to adopt, but gradually the application of AI in dental practice will become a necessity and might drive patient demand too. AI has the proven ability to reason and act in the best manner for achieving a specific goal; such automated models can easily execute tasks from simple to complex in nature.

5. Conclusions

At present, AI is used widely in dentistry. It has the potential to revolutionize oral health care by helping to address the weaknesses often criticized in conventional dental care.

Based on the findings of this systematic review, it was concluded that:
(1) AI techniques assist dental practitioners in numerous ways, from decreasing chairside time and saving extra steps to achieving excellent infection control and providing quality treatment with accuracy and precision.
(2) AI can be successfully used for patient diagnosis, clinical decision making, and prediction of dental failures. Hence, it is a reliable modality for future application in oral diagnosis, oral and maxillofacial surgery, restorative dentistry, prosthodontics, orthodontics, endodontics, forensic dentistry, radiology, and periodontics.
(3) Although AI is increasing the scope of state-of-the-art models in dentistry, it is still under development. Further studies are required to assess the clinical performance of AI techniques in dentistry.

Data Availability

The raw data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare no conflict of interest.

Authors’ Contributions

NA, FZ, and WQ planned and designed the present work, and MSA was responsible for realizing the work. NA, MSA, FZ, and WQ were responsible for the data acquisition and analysis. NA, MSH, and AM drafted and revised the manuscript. NA, MKA, MSH, and AM approved the final version of the manuscript. All authors read and approved the final manuscript. MSH and MKA contributed equally to this work and are corresponding authors.


Acknowledgments

The authors thank Mr. Muhammad Hashim and Dr. Zohaib Khurshid for guidance and help with English language editing and for ideas on Figures 1 and 2 in this study. We are also grateful to Universiti Sains Malaysia, Jouf University, Kingdom of Saudi Arabia, and the Altamash Institute of Dental Medicine, Pakistan, for their support and facilitation of this study.


References

  1. A. Barr, E. A. Feigenbaum, and P. R. Cohen, The Handbook of Artificial Intelligence, vol. 1-3, William Kaufmann Inc., Los Altos, CA, 1981.
  2. F. Schwendicke, W. Samek, and J. Krois, “Artificial intelligence in dentistry: chances and challenges,” Journal of Dental Research, vol. 99, no. 7, pp. 769–774, 2020.
  3. Y. W. Chen, K. Stanley, and W. Att, “Artificial intelligence in dentistry: current applications and future perspectives,” Quintessence International, vol. 51, no. 3, pp. 248–257, 2020.
  4. M. Cozzani, D. Sadri, L. Nucci, P. Jamilian, A. P. Pirhadirad, and A. Jamilian, “The effect of Alexander, Gianelly, Roth, and MBT bracket systems on anterior retraction: a 3-dimensional finite element study,” Clinical Oral Investigations, vol. 24, no. 3, pp. 1351–1357, 2020.
  5. R. Abdalla-Aslan, T. Yeshua, D. Kabla, I. Leichter, and C. Nadler, “An artificial intelligence system using machine-learning for automatic detection and classification of dental restorations in panoramic radiography,” Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology, vol. 130, no. 5, pp. 593–602, 2020.
  6. M. Bouchahma, S. Ben Hammouda, S. Kouki, M. Alshemaili, and K. Samara, “An automatic dental decay treatment prediction using a deep convolutional neural network on X-ray images,” in 2019 IEEE/ACS 16th International Conference on Computer Systems and Applications (AICCSA), pp. 1–4, Abu Dhabi, United Arab Emirates, 2019.
  7. C. Kuwada, Y. Ariji, M. Fukuda et al., “Deep learning systems for detecting and classifying the presence of impacted supernumerary teeth in the maxillary incisor region on panoramic radiographs,” Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology, vol. 130, no. 4, pp. 464–469, 2020.
  8. J. H. Lee, S. S. Han, Y. H. Kim, C. Lee, and I. Kim, “Application of a fully deep convolutional neural network to the automation of tooth segmentation on panoramic radiographs,” Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology, vol. 129, no. 6, pp. 635–642, 2020.
  9. T. Ekert, J. Krois, L. Meinhold et al., “Deep learning for the radiographic detection of apical lesions,” Journal of Endodontics, vol. 45, no. 7, pp. 917–922.e5, 2019.
  10. M. A. Saghiri, F. Garcia-Godoy, J. L. Gutmann, M. Lotfi, and K. Asgar, “The reliability of artificial neural network in locating minor apical foramen: a cadaver study,” Journal of Endodontics, vol. 38, no. 8, pp. 1130–1134, 2012.
  11. H. Deniz Arısu, E. Eligüzeloglu Dalkilic, F. Alkan, S. Erol, M. B. Uctasli, and A. Cebi, “Use of artificial neural network in determination of shade, light curing unit, and composite parameters’ effect on bottom/top Vickers hardness ratio of composites,” BioMed Research International, vol. 2018, Article ID 4856707, 9 pages, 2018.
  12. S. Yamaguchi, C. Lee, O. Karaer, S. Ban, A. Mine, and S. Imazato, “Predicting the debonding of CAD/CAM composite resin crowns with AI,” Journal of Dental Research, vol. 98, no. 11, pp. 1234–1238, 2019.
  13. T. Otani, A. J. Raigrodski, L. Mancl, I. Kanuma, and J. Rosen, “In vitro evaluation of accuracy and precision of automated robotic tooth preparation system for porcelain laminate veneers,” The Journal of Prosthetic Dentistry, vol. 114, no. 2, pp. 229–235, 2015.
  14. L. Wang, D. Wang, Y. Zhang, L. Ma, Y. Sun, and P. Lv, “An automatic robotic system for three-dimensional tooth crown preparation using a picosecond laser,” Lasers in Surgery and Medicine, vol. 46, no. 7, pp. 573–581, 2014.
  15. J. Takahashi, K. Nozaki, T. Gonda, and K. Ikebe, “A system for designing removable partial dentures using artificial intelligence. Part 1. Classification of partially edentulous arches using a convolutional neural network,” Journal of Prosthodontic Research, vol. 65, no. 1, pp. 115–118, 2021.
  16. R. Patcas, D. A. J. Bernini, A. Volokitin, E. Agustsson, R. Rothe, and R. Timofte, “Applying artificial intelligence to assess the impact of orthognathic treatment on facial attractiveness and estimated age,” International Journal of Oral and Maxillofacial Surgery, vol. 48, no. 1, pp. 77–83, 2019.
  17. M. Li, X. Xu, K. Punithakumar, L. H. Le, N. Kaipatur, and B. Shi, “Automated integration of facial and intra-oral images of anterior teeth,” Computers in Biology and Medicine, vol. 122, article 103794, 2020.
  18. D. Moher, A. Liberati, J. A. Tetzlaff, D. G. Altman, and PRISMA Group, “Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement,” Annals of Internal Medicine, vol. 151, no. 4, pp. 264–9, W64, 2009.
  19. J. P. Higgins and S. Green, Cochrane Handbook for Systematic Reviews of Interventions, vol. 4, John Wiley & Sons, 2011.
  20. G. A. Wells, B. Shea, and D. O’Connell, The Newcastle-Ottawa Scale (NOS) for Assessing the Quality of Nonrandomized Studies in Meta-Analyses, Ottawa Hospital Research Institute, Ottawa, ON, Canada, 2011.
  21. A. R. Gould, “Early detection of oral premalignant disease and oral cancer: refining the process,” Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology, and Endodontics, vol. 94, no. 4, pp. 397–398, 2002.
  22. W. J. Van der Meer, A. Vissink, Y. L. Ng, and K. Gulabivala, “3D computer aided treatment planning in endodontics,” Journal of Dentistry, vol. 45, pp. 67–72, 2016.
  23. V. Vera, E. Corchado, R. Redondo, J. Sedano, and A. E. Garcia, “Applying soft computing techniques to optimise a dental milling process,” Neurocomputing, vol. 109, pp. 94–104, 2013.
  24. D. Leeson, “The digital factory in both the modern dental lab and clinic,” Dental Materials, vol. 36, no. 1, pp. 43–52, 2020.
  25. E. D. Rekow, “Digital dentistry: the new state of the art -- is it disruptive or destructive?” Dental Materials, vol. 36, no. 1, pp. 9–24, 2020.
  26. G. I. McCracken, J. H. Nunn, R. S. Hobson, J. J. Stephenson, and N. J. Jepson, “Evaluation of a computer-assisted learning package on the management of traumatised incisors by general dental practitioners,” Endodontics & Dental Traumatology, vol. 16, no. 1, pp. 40–42, 2000.
  27. Q. Cui, Q. Chen, P. Liu, D. Liu, and Z. Wen, “Clinical decision support model for tooth extraction therapy derived from electronic dental records,” The Journal of Prosthetic Dentistry, vol. 20, pp. S0022–S3913, 2020.
  28. D. W. Kim, H. Kim, W. Nam, H. J. Kim, and I. H. Cha, “Machine learning to predict the occurrence of bisphosphonate-related osteonecrosis of the jaw associated with dental extraction: a preliminary report,” Bone, vol. 116, pp. 207–214, 2018.
  29. P. de Dumast, C. Mirabel, L. Cevidanes et al., “A web-based system for neural network based classification in temporomandibular joint osteoarthritis,” Computerized Medical Imaging and Graphics, vol. 67, pp. 45–54, 2018.
  30. D. H. Edinger, “Accuracy of a robotic system for the reproduction of condylar movements: a preliminary report,” Quintessence International, vol. 35, no. 7, pp. 519–523, 2004.
  31. G. Meissner, B. Oehme, J. Strackeljan, and T. Kocher, “In vitro calculus detection with a moved smart ultrasonic device,” Journal of Clinical Periodontology, vol. 33, no. 2, pp. 130–134, 2006.
  32. G. Meissner, B. Oehme, J. Strackeljan, and T. Kocher, “Influence of handling-relevant factors on the behaviour of a novel calculus-detection device,” Journal of Clinical Periodontology, vol. 32, no. 3, pp. 323–328, 2005.
  33. K. L. Devito, F. de Souza Barbosa, and W. N. F. Filho, “An artificial multilayer perceptron neural network for diagnosis of proximal dental caries,” Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology, and Endodontics, vol. 106, no. 6, pp. 879–884, 2008.
  34. S. Kositbowornchai, S. Siriteptawee, S. Plermkamon, S. Bureerat, and D. Chetchotsak, “An artificial neural network for detection of simulated dental caries,” International Journal of Computer Assisted Radiology and Surgery, vol. 1, no. 2, pp. 91–96, 2006.
  35. R. Patcas, R. Timofte, A. Volokitin et al., “Facial attractiveness of cleft patients: a direct comparison between artificial-intelligence-based scoring and conventional rater groups,” European Journal of Orthodontics, vol. 41, no. 4, pp. 428–433, 2019.
  36. J. H. Lee, D. H. Kim, S. N. Jeong, and S. H. Choi, “Detection and diagnosis of dental caries using a deep learning-based convolutional neural network algorithm,” Journal of Dentistry, vol. 77, pp. 106–111, 2018.
  37. M. Vranckx, A. Van Gerven, H. Willems et al., “Artificial intelligence (AI)-driven molar angulation measurements to predict third molar eruption on panoramic radiographs,” International Journal of Environmental Research and Public Health, vol. 17, no. 10, p. 3716, 2020.
  38. K. S. Lee, H. J. Kwak, J. M. Oh et al., “Automated detection of TMJ osteoarthritis based on artificial intelligence,” Journal of Dental Research, vol. 99, no. 12, pp. 1363–1367, 2020.
  39. M. Hung, M. W. Voss, M. N. Rosales et al., “Application of machine learning for diagnostic prediction of root caries,” Gerodontology, vol. 36, no. 4, pp. 395–404, 2019.
  40. M. Sornam and M. Prabhakaran, “Logit-based artificial bee colony optimization (LB-ABC) approach for dental caries classification using a back propagation neural network,” in Integrated Intelligent Computing, Communication and Security, vol. 771, pp. 79–91, Springer, Singapore, 2019.
  41. F. C. Setzer, K. J. Shi, Z. Zhang et al., “Artificial intelligence for the computer-aided detection of periapical lesions in cone-beam computed tomographic images,” Journal of Endodontics, vol. 46, no. 7, pp. 987–993, 2020.
  42. A. G. Cantu, S. Gehrung, J. Krois et al., “Detecting caries lesions of different radiographic extension on bitewings using deep learning,” Journal of Dentistry, vol. 100, article 103425, 2020.
  43. M. M. Sorkhabi and M. Saadat Khajeh, “Classification of alveolar bone density using 3-D deep convolutional neural network in the cone-beam CT images: a 6-month clinical study,” Measurement, vol. 148, article 106945, 2019.
  44. H. Li, L. Lai, L. Chen, C. Lu, and Q. Cai, “The prediction in computer color matching of dentistry based on GA+BP neural network,” Computational and Mathematical Methods in Medicine, vol. 2015, Article ID 816719, 7 pages, 2015.
  45. I. Aliaga, V. Vera, M. Vera, E. García, M. Pedrera, and G. Pajares, “Automatic computation of mandibular indices in dental panoramic radiographs for early osteoporosis detection,” Artificial Intelligence in Medicine, vol. 103, article 101816, 2020.
  46. J. Grischke, L. Johannsmeier, L. Eich, L. Griga, and S. Haddadin, “Dentronics: towards robotics and artificial intelligence in dentistry,” Dental Materials, vol. 36, no. 6, pp. 765–778, 2020.
  47. S. S. Khanna and P. A. Dhaimade, “Artificial intelligence: transforming dentistry today,” Indian Journal of Basic and Applied Medical Research, vol. 6, no. 3, pp. 161–167, 2017.
  48. T. Joda, M. M. Bornstein, R. E. Jung, M. Ferrari, T. Waltimo, and N. U. Zitzmann, “Recent trends and future direction of dental research in the digital era,” International Journal of Environmental Research and Public Health, vol. 17, no. 6, p. 1987, 2020.
  49. T. Shan, F. R. Tay, and L. Gu, “Application of artificial intelligence in dentistry,” Journal of Dental Research, vol. 100, no. 3, pp. 232–244, 2021.
  50. F. Schwendicke, T. Singh, J. H. Lee et al., “Artificial intelligence in dental research: checklist for authors, reviewers, readers,” Journal of Dentistry, vol. 107, article 103610, 2021.
  51. G. Drevenstedt, J. Mcdonald, and L. Drevenstedt, “The role of voice-activated technology in today's dental practice,” Journal of the American Dental Association (1939), vol. 136, no. 2, pp. 157–161, 2005.

Copyright © 2021 Naseer Ahmed et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
