Special Issue: Interaction and Experience Design and Evaluation Issues in Mobile Information Systems
Research Article | Open Access
Jung-Sing Jwo, Ching-Sheng Lin, Cheng-Hsiung Lee, "An Interactive Dashboard Using a Virtual Assistant for Visualizing Smart Manufacturing", Mobile Information Systems, vol. 2021, Article ID 5578239, 9 pages, 2021. https://doi.org/10.1155/2021/5578239
An Interactive Dashboard Using a Virtual Assistant for Visualizing Smart Manufacturing
In the era of Industry 4.0, manufacturing sites are becoming more sophisticated and connected with the aid of Information Technology (IT) and Industrial Internet of Things (IIoT) infrastructure. What distinguishes a smart manufacturing enterprise from a traditional one is the ability to solve existing problems and to predict and fix issues before they occur, while creating added value. Dashboards, which assist decision-makers in business settings, organize data from machines, sensors, and workers into a real-time visual representation. They provide a quick overview of how the entire business is operating and where it stands relative to its key performance indicators, supporting the development of more resource-efficient and sustainable processes. However, when users need more information about the indicators shown on the dashboard, current approaches rely mainly on human effort to search for and retrieve it. The lack of interaction between the dashboard and users often results in inconvenient operation and limits the usefulness of the tool, especially in the manufacturing industry. In this study, an interaction design is developed to resolve this issue. The proposed dashboard is built on a three-layer structure comprising a data layer, a processing layer, and a presentation layer. A virtual assistant is introduced to play the role of a mediator between the dashboard and the user. It provides users with a natural language interface, directly interacts with the dashboard on users’ behalf, and actively reminds relevant personnel to expedite data collection. Compared with the traditional human-machine interface, this lightweight and easy-to-use design not only creates a frictionless and intuitive experience but also ensures that data are displayed in a timely manner.
1. Introduction

The manufacturing industry faces increasingly immense challenges in a dynamic and highly competitive global marketplace. The growth of data is exceeding the capacity of traditional approaches to accommodate it. For example, in a Consumer Packaged Goods company, a single production line for a personal care product can generate 5000 data samples every 33 milliseconds, amounting to roughly 152,000 samples per second, or four trillion (i.e., 4 Tera) samples per year. This has ushered in the industrial big data epoch, and the question becomes how to handle these data sources and convert the results into usable formats. Since operators must monitor the status of the process to react to emerging issues, and leadership must see exactly what is going on to ensure the entire business operates efficiently, comprehensive information provision on the shop floor is necessary.
According to one definition, “a dashboard is a single-screen display that shows important information about a company so that the whole situation, for example, in a factory or on a production line, can be quickly understood.” The dashboard concept is not new in business and has been developed in the high-tech industry for decades to analyze and display key performance indicators (KPIs). Dashboards can be customized to meet specific requirements for different purposes and across different industries. However, in the manufacturing industry, the gathering and analysis of data from different sources have only recently become widely available. Subsequently, the desire to turn data into valuable insights and action makes manufacturing dashboards increasingly attractive, even in small and medium-sized enterprises (SMEs). In general, dashboards offer a number of advantages, including a visual display of key information, real-time information display, and customization and personalization [4–6].
At manufacturing sites, there are two main challenges in the usage of dashboards. First, interaction with the dashboard generally means viewing the visualization results generated from the data; any other request may require a touchscreen or a mouse to control. Furthermore, if the user has questions about the content shown on the dashboard and needs more detailed information on a specific indicator, an assistant or a technician may be needed to help execute the task. This causes inconvenience and requires extra human involvement. Second, data availability and integrity is another issue to be considered. For example, due to shop-floor working culture, operators often tend to defer data entry until their duties are complete in order to save time. This can lead to missing values and result in inaccurate visualizations. Our research questions for this study are as follows. RQ1: How can an effective and intuitive interaction method be developed for users to communicate with dashboards, reducing human effort and burden? RQ2: How can the data collection process be improved to provide more real-time data for the dashboards?
In this paper, a virtual assistant (VA) is proposed to play the role of a mediator between the dashboard and the user. The VA provides a natural language interface for users to communicate with the dashboard. The user can directly ask the VA to perform operations on the dashboard, such as switching to the view of “the real-time equipment status.” Moreover, it can receive information-inquiry commands from users and then drill down into the data in response to the input question. To resolve the missing data issue, the VA can proactively contact the relevant personnel and remind them to enter data. The contribution of this paper is twofold:
(1) A human-centered design methodology using a VA is described. As the VA is easily accessible, employees in the manufacturing industry can install and use it to interact with the dashboard for information inquiry without a steep learning curve.
(2) Instead of relying on human labor for data collection activities, the VA plays an active role in making requests to the related personnel when missing data are found. In this way, even SMEs can save time and reduce labor costs to guarantee data quality and gradually move toward smart manufacturing.
The remainder of this paper is organized as follows. Section 2 reviews advanced techniques related to the work of this paper. The proposed design and system architecture are described in Section 3. Example scenarios and the implementation are discussed in Section 4. Section 5 presents conclusions and summarizes future research directions.
2. Related Work
Dashboards in this paper are the main interface between the backend systems and users. First appearing in the early 1990s and long of interest to researchers, they provide a means of summarizing important information and visualizing it with graphical components such as charts, graphs, diagrams, and maps. From the perspective of design philosophy, dashboards fall into a visual genre and a functional genre. In general, the visual genre represents data and information in a tiled layout with simple graphs and numbers, whereas the functional genre involves an interactive display that enables real-time monitoring and supports infographic elements or narrative visualizations.
Dashboards were mainly developed for business management and decision support but have gradually been employed to address the needs of various domains and purposes. As modern cities grow along multiple dimensions (e.g., population, geographical size, and economic activities), it is important to provide real-time awareness applications that incorporate sensor data and network infrastructure. Urban dashboards are used to visualize information on a city’s current state and offer communication between city operators and citizens [6, 9]. Health-related issues are a major concern nowadays. COVID-19 is the latest threat, becoming a major health concern and changing human behavior, business, and, most importantly, ways of life. The adoption of dashboards to track COVID-19 makes the information more accessible to the public, providing greater clarity, and to decision-makers for monitoring health surveillance [11, 12]. In the manufacturing industry, the rapid development of the Internet of Things (IoT) has produced a massive volume of data. Although dashboards can visualize business performance information, research on what information should appear on dashboards is less discussed. Tokola et al. conducted a comprehensive survey to investigate the preferred KPIs and usage at different hierarchy levels (workers, managers, and executives). Based on the feedback from the survey, three manufacturing dashboards were designed: an operational dashboard for workers, a tactical dashboard for managers, and a strategic dashboard for executives. Although dashboards have been designed for a range of stakeholders to support insight into data, research on interaction design for dashboards still needs to be addressed.
The VA is a mediator between the dashboard and the user in this study. Due to the increasing prevalence of smartphones, the VA is widely involved in human daily life, ranging from scheduling wake-up calls to managing appointments. It is an intelligent agent with a natural language interface that executes tasks or services for users. Natural language processing is the key technique behind the VA, and it leverages classical pipeline components such as automatic speech recognition (ASR), natural language understanding (NLU), and natural language generation (NLG) [13, 14]. ASR translates human speech into text. NLU recognizes the intent of user utterances by performing text preprocessing and analyzing semantics. Once the task for the given intent has been executed and the answer to the user’s inquiry is obtained, NLG produces a meaningful response for the user. Early work in these three areas generally focused on feature-based approaches; research has recently shifted to deep neural network methods, with growing interest in models trainable in an end-to-end manner [15–17].
Because of their accessibility and ease of use, more and more organizations have launched VAs for business and professional needs. To integrate machines and devices into existing manufacturing IT systems, the concept of the Manufacturing Integration Assistant (MIALinx) has been proposed. It provides a lightweight solution that simplifies the integration procedure through reusable “IF-THEN” rules defined by domain experts for the Industry 4.0 environment. In the health care field, many AI applications have been adopted to improve medical services, and the VA is no exception. A time-and-motion study of how physician time is spent found that physicians spend about half of their office time on electronic health records (EHR) but less than one-third of clinical face time engaging with patients. Several virtual medical assistants, such as Nuance and Robin Healthcare, have been proposed to facilitate and automate the creation of clinical documents. The Medical Instructed Real-time Assistant (MIRA) diagnoses a disease by listening to users’ complaints and refers them to an appropriate nearby medical specialist as a final recommendation. Although the VA has been employed in firms, its application to interaction design and data collection still lacks study and needs to be researched in detail.
3. System Design and Architecture
Since humans are the most valuable asset in the manufacturing industry, it is essential to treat human operations as an integral part of the manufacturing process and to account for the human-in-the-loop in the interaction design. Based on this design principle, we employ the virtual assistant as a means to aid interaction between humans and dashboards and to accelerate data collection. The proposed architecture, presented in Figure 1, consists of two components: the dashboard module and the virtual assistant module. The dashboard module, designed as a three-layer structure, controls the visualization task: the data layer is responsible for data exchange, the processing layer focuses on data organization, and the presentation layer governs the display of key information. The virtual assistant module primarily handles requests from users, manipulates the display of the dashboard, and contacts the relevant personnel for data input. The user sits between the two modules, managing the data flow between the two ends. The ultimate goal of the proposed work is to enhance the interaction between users and the dashboard, giving users a more natural interface for obtaining insights while driving data collection in a novel and productive way through the virtual assistant. The following subsections describe the architecture, components, and workflow of each module in more detail.
3.1. Dashboard

The proposed dashboard architecture (top dashed rectangle in Figure 1) follows a bottom-up design approach and consists of three layers: (1) the data layer, (2) the processing layer, and (3) the presentation layer. The data layer aggregates data from different sources and is a crucial requirement for a dashboard to function. The processing layer analyzes the data acquired from the data layer and prepares it for visualization. The presentation layer displays the important indicators for users and reflects the results of user queries.
As data are the main ingredient in creating information, the data layer plays a fundamental role in the dashboard architecture. It is responsible for providing the large volumes of data to be processed for visualization. In the manufacturing environment, we categorize data sources into two types: real-time data and historical data. During the manufacturing process, a large amount of real-time data is generated and must be properly stored. Various IoT devices, such as Radio Frequency Identification (RFID) tags and smart sensors, are used to collect data in real time. RFID technology has been widely applied on manufacturing shop floors to automate data capture and identification. Smart sensors are an integral part of many manufacturing applications, collecting data on sound, temperature, vibration, pressure, operational status, and more to monitor the manufacturing process. Meanwhile, historical data are stored in databases and information systems such as Enterprise Resource Planning (ERP), Manufacturing Execution Systems (MES), and Supply Chain Management (SCM). These legacy systems make data exchange easier within the organization and hold a variety of data that can be analyzed for failure prediction, quality monitoring, and shop-floor production management.
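The two data-source types above can be stored behind one uniform interface. The following is a minimal sketch, not the authors' implementation: table names, field names, and the use of SQLite are assumptions chosen for illustration, standing in for the shop-floor databases described in the text.

```python
import sqlite3
import time

def init_store(conn):
    # One table for streaming sensor samples, one for records imported
    # from legacy systems (ERP/MES/SCM exports). Schemas are illustrative.
    conn.execute("""CREATE TABLE IF NOT EXISTS sensor_samples (
        equipment_id TEXT, metric TEXT, value REAL, ts REAL)""")
    conn.execute("""CREATE TABLE IF NOT EXISTS historical_records (
        equipment_id TEXT, record_type TEXT, payload TEXT, ts REAL)""")

def ingest_sample(conn, equipment_id, metric, value, ts=None):
    # Real-time path: append a timestamped sample from an IoT device.
    conn.execute("INSERT INTO sensor_samples VALUES (?, ?, ?, ?)",
                 (equipment_id, metric, value, ts or time.time()))

def latest_value(conn, equipment_id, metric):
    # Query path used by the processing layer: most recent reading.
    row = conn.execute(
        "SELECT value FROM sensor_samples WHERE equipment_id=? AND metric=? "
        "ORDER BY ts DESC LIMIT 1", (equipment_id, metric)).fetchone()
    return row[0] if row else None

conn = sqlite3.connect(":memory:")
init_store(conn)
ingest_sample(conn, "press-01", "temperature", 72.5, ts=1.0)
ingest_sample(conn, "press-01", "temperature", 73.1, ts=2.0)
print(latest_value(conn, "press-01", "temperature"))  # 73.1
```

In a production deployment, the in-memory database would be replaced by the plant's actual historian or time-series store; the point is only that both real-time and historical sources feed one query surface.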
While data are collected and stored in the data layer, additional procedures are required to extract the data and generate representations suitable for visualization. The processing layer is the middle layer of the three-layer architecture, connecting the data layer and the presentation layer. For structured data from manufacturing information systems such as ERP, MES, and CRM, a way must be defined for an external program to interact with these systems. An Application Program Interface (API) serves as the middleman between a system and a program, controlling the data to be requested and returned. In addition to data stored in information systems, manufacturers collect large amounts of sensor data and store them in databases. As real-time data are frequently updated, preprocessing is an important step to save time and effort. Once the data are ready, we process and analyze them to extract valuable information for the key indicators. The presentation layer and the VA then have direct access to these analytics through APIs.
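As a concrete illustration of the processing layer turning raw events into a KPI served through an API-style entry point, the sketch below computes equipment availability from a status-event log. The event format, state names, and the `get_kpi` function are assumptions for illustration, not part of the paper's system.

```python
def availability(events, period_start, period_end):
    """Fraction of the period the equipment spent in the 'running' state.

    events: list of (timestamp, state) pairs sorted by timestamp.
    """
    uptime, prev_ts, prev_state = 0.0, period_start, "stopped"
    for ts, state in events:
        if prev_state == "running":
            uptime += ts - prev_ts
        prev_ts, prev_state = ts, state
    if prev_state == "running":
        uptime += period_end - prev_ts
    return uptime / (period_end - period_start)

def get_kpi(name, events, start, end):
    # API-style entry point the presentation layer or VA could call.
    if name == "availability":
        return round(availability(events, start, end), 3)
    raise KeyError(f"unknown KPI: {name}")

# Running from t=0-60 and t=80-100 out of a 100-unit period.
events = [(0, "running"), (60, "stopped"), (80, "running")]
print(get_kpi("availability", events, 0, 100))  # 0.8
```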
Data visualization models the core understanding of the data and provides users with a real-time overview of the status of selected KPIs. The presentation layer maps processed information to a visual representation that enables the management of industrial production. Identifying important and appropriate KPIs to show on the dashboard is crucial to the success of the technology. For indicators and information not suitable for display, other tools, such as the VA, are required for further inquiries. Moreover, flexible use of the user interface is necessary for interaction, so we reserve a panel that can show the corresponding content in response to users’ requests. Since the target users of the dashboard are generally not information technology professionals, the design and layout should be intuitive and self-explanatory. It is therefore imperative for design experts and users to cooperate to reach agreement on appearance and presentation standards before implementation. Visual components for conveying important information include traditional graphs (such as line graphs, bar charts, pie charts, and histograms), tables, images (pictures and videos), and diagrams (such as trees, graphs, and networks). Colors and geometric symbols can be used to indicate numbers, amounts, and statistics, conveying the severity of problems or the progress of execution.
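The color-coding idea above can be sketched as a simple threshold mapping from a KPI value to a tile color. The thresholds and the green/yellow/red scheme are illustrative assumptions, not values from the paper.

```python
def severity_color(value, warn, critical):
    """Map a KPI value (where higher is worse, e.g., defect rate)
    to a dashboard tile color via two thresholds."""
    if value >= critical:
        return "red"
    if value >= warn:
        return "yellow"
    return "green"

# Hypothetical defect-rate thresholds: warn at 5%, critical at 10%.
print(severity_color(0.02, warn=0.05, critical=0.10))  # green
print(severity_color(0.12, warn=0.05, critical=0.10))  # red
```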
3.2. Virtual Assistant
The proposed VA is designed to receive commands from users, execute the corresponding actions, and interact with the dashboard and the related personnel. The architecture of this part contains four components which are speech user interface, natural language understanding, dialogue management, and natural language generation (bottom dashed rectangle in Figure 1).
The speech user interface (SUI) is becoming the primary way for users to control a machine and its backend assistant. Because of the rising popularity of smartphones, we develop our VA as a mobile application and use the SUI as the main input channel. The speech recognition engine, which supports natural communication, automatically transforms the user’s voice input into text. The natural language understanding (NLU) module identifies the intent of the user’s input by mapping the utterance to a predefined class. The dialogue management (DM) module is responsible for conversation understanding, state tracking, and output generation. This module initially assigns an identifier to each dialogue. For each turn in a dialogue, it estimates the user’s goal, maintains the dialogue state, takes the corresponding action to satisfy the user’s request, and generates the relevant response as input for natural language generation (NLG). Our action types include calling an API to retrieve data or information, interacting with the dashboard to change its content, and initiating a new dialogue with another user for the purpose of data collection. While viewing the dashboard, users may want to drill down into further details; we implement APIs to fulfill these functions and provide access to the available resources. To directly control the content of the dashboard, we likewise leverage APIs, as discussed in Section 3.1. A common situation in manufacturing is incomplete data. When the user asks a question whose answer is not available, the VA starts a new dialogue session to proactively reach out to the relevant individual responsible for the answer and give a reminder. This is a very important step toward data-driven smart manufacturing. Therefore, the VA plays not only an information-providing role but also an information-collecting role.
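The DM module's mapping from recognized intents to the three action types can be sketched as a dispatch table. The intent names, slot keys, and handler behavior below are hypothetical stand-ins, not the paper's actual schema.

```python
def handle_query(slots):
    # Action type 1: call an API to retrieve data/information.
    return {"action": "api_call", "question": slots.get("question")}

def handle_dashboard(slots):
    # Action type 2: interact with the dashboard to change its content.
    return {"action": "dashboard_control", "view": slots.get("view")}

def handle_reminder(slots):
    # Action type 3: open a new dialogue with the data owner.
    return {"action": "new_dialogue", "contact": slots.get("owner")}

INTENT_ACTIONS = {
    "ask_data": handle_query,
    "control_dashboard": handle_dashboard,
    "missing_data": handle_reminder,
}

def dispatch(intent, slots):
    handler = INTENT_ACTIONS.get(intent)
    if handler is None:
        return {"action": "fallback"}  # unrecognized intent
    return handler(slots)

print(dispatch("control_dashboard", {"view": "real-time equipment status"}))
```

A real DM would also track the dialogue state across turns; the table above covers only the per-turn action selection.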
The NLG module is in charge of generating natural language text from the response information produced by the DM module. In this study, we employ a template-based approach to produce candidate sentences, which are finally fed to the text-to-speech engine to deliver a voice response to the user.
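A template-based NLG of the kind described can be as simple as slot-filled format strings keyed by response type. The template wording and response-type names below are illustrative assumptions, not the authors' actual templates.

```python
TEMPLATES = {
    "defective_count": "There were {count} defective items {period}.",
    "equipment_status": "Equipment {equipment} is currently {state}.",
    "reminder_sent": "I have reminded {person} to enter the data.",
}

def generate(response_type, **slots):
    # Fill the matching template; the result would then be handed
    # to the text-to-speech engine.
    template = TEMPLATES.get(response_type)
    if template is None:
        return "Sorry, I could not find an answer."
    return template.format(**slots)

print(generate("defective_count", count=12, period="yesterday"))
# There were 12 defective items yesterday.
```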
4. Implementation and Results
The virtual assistant is implemented as an app that can be installed on Android-based mobile devices. For speech recognition and text-to-speech conversion, we use the Google Cloud Speech API. The Microsoft Cognitive Services Language Understanding service (LUIS) is a cloud-based service that uses machine learning to predict sentence meaning and extract relevant details. We train a LUIS model for the NLU module, mainly for intent prediction.
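The paper delegates intent prediction to LUIS; as a self-contained stand-in for readers, the sketch below scores intents by keyword overlap. This is a deliberately simplified placeholder under assumed intent names and keyword sets, not the LUIS API or a trained model.

```python
# Hypothetical intent vocabulary; a real system would train a cloud
# NLU model (e.g., LUIS) instead of hand-listing keywords.
INTENT_KEYWORDS = {
    "ask_data": {"how", "many", "defective", "number", "rate"},
    "control_dashboard": {"show", "enlarge", "display", "view"},
}

def predict_intent(utterance):
    # Pick the intent whose keyword set best overlaps the utterance.
    tokens = set(utterance.lower().split())
    scores = {intent: len(tokens & kws)
              for intent, kws in INTENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "none"

print(predict_intent("show the live scenes in the factory"))  # control_dashboard
```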
The manufacturing-based dashboard has a long, narrow shape and contains seven views, as shown in Figure 2. Our design concept is to place similar indices close together and to arrange the interactive region in the central area. For further discussion, we decompose the dashboard into the seven views from left to right in Figure 2 and illustrate them in Figure 3. The 1st view (Figure 3(a)) covers equipment indices, including the real-time equipment status, equipment availability, the number of equipment maintenance tasks past their due date, and the number of equipment maintenance tasks in the past 30 days. Hereafter, the index arrangement of each view runs from the top left, counterclockwise. Tables track the detailed status of the equipment, and counter tiles indicate the amount of equipment maintenance. The 2nd view (Figure 3(b)) presents the abnormal data of the equipment, letting users monitor equipment health and review historical situations. There are four indices in this view: the number of alarms for each device, today’s abnormal events, abnormal reasons in the past 30 days, and yesterday’s abnormal reasons. Bar charts are useful for comparing abnormal reasons and displaying the historical abnormal events of each device. The 3rd view (Figure 3(c)) concerns the production progress, consisting of the defective rate in the past three months, daily production this month, today’s actual production, and the actual production of this month. By observing these components, users can see up-to-date production data and quality information. The 4th view (Figure 3(d)), placed in the middle of the dashboard, is the primary view and is twice as large as the other views.
It is essentially dedicated to displaying the results of interaction between users and the dashboard, such as showing live scenes of the shop floor or enlarging “yesterday’s abnormal reasons.” In this process, the user gives commands to the VA, which invokes the API to show the result on the dashboard. The 5th view (Figure 3(e)) contains the current time, energy consumption status, environment monitoring, today’s rate of work-order completion, and today’s detailed production. This view is introduced to monitor the environment and utility consumption and to track work-order performance. The 6th view (Figure 3(f)) relies on speedometers to highlight the equipment and production evaluations, including overall equipment availability, overall equipment effectiveness, production rate, and production yield rate. The 7th view (Figure 3(g)) focuses on the production lines and equipment statistics, containing the production schedule, statistics of historical equipment status, historical status data of each piece of equipment, and statistics of real-time equipment status. Users can examine the analysis of the production lines and inspect equipment status statistics in terms of both real-time and historical data. Finally, we create a dashboard in a demonstration room, a 300-degree circular space designed to give users a complete and interactive experience, as shown in Figure 4.
We implement the VA as an app on an Android-based smartphone and show the interface in Figure 5. The user pushes the microphone button on the touchscreen to talk, and the VA displays the text results. The corresponding information and graphic results are updated on the dashboard as well. In the following, we discuss the three typical types of scenarios that were developed. Note that Xiaodong is the name of our virtual assistant.
(1) Type 1: When users view the dashboard and require additional information, the VA processes the voice commands and answers the question through APIs. In Scenario 1, the user asks for the number of yesterday’s defective items, and the VA responds with the amount verbally.
(2) Type 2: Interaction with the dashboard is an important function of the VA. Users can ask the VA to control the dashboard by commands. Currently, the interaction modes include showing live scenes and enlarging one of the specific views on the dashboard. In Scenario 2, the user asks the VA to show the scenes in the factory, and the result is similar to the one displayed in the middle view of Figure 4.
(3) Type 3: The last type of scenario concerns the initiative of the VA. When the user raises a question but the VA cannot determine the answer due to incomplete or missing data, it proactively reaches out to the relevant personnel. In Scenario 3, User_A inquires about today’s defective items, which require input from User_B. The VA then initiates a new dialogue to remind User_B on User_A’s behalf.
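The Type 3 flow above can be sketched as an answer-or-remind routine: if the queried value is missing, the VA looks up who owns that data and opens a reminder dialogue. The ownership mapping, question keys, and message wording are hypothetical, chosen only to illustrate the control flow.

```python
# Hypothetical mapping from a data item to the person who must enter it.
DATA_OWNERS = {"today_defective_items": "User_B"}

def answer_or_remind(question_key, store, send_message):
    """Answer from the store if possible; otherwise remind the owner.

    store: dict of currently available data.
    send_message: callable (recipient, text) standing in for starting
    a new dialogue session with that user.
    """
    value = store.get(question_key)
    if value is not None:
        return f"The answer is {value}."
    owner = DATA_OWNERS.get(question_key)
    if owner:
        send_message(owner, f"Please enter the data for {question_key}.")
        return f"The data are not available yet; I have reminded {owner}."
    return "The data are not available."

# Scenario 3: the value is missing, so User_B gets a reminder.
sent = []
reply = answer_or_remind("today_defective_items", {},
                         lambda to, msg: sent.append((to, msg)))
print(reply)
print(sent[0][0])  # User_B
```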
Since the proposed solution is primarily intended for use in the manufacturing industry, we compare it with the human-machine interfaces (HMIs) used in manufacturing systems, as shown in Table 1. The proposed approach has several distinct advantages over HMIs. First, it offers more flexibility in interaction techniques and portability. Second, the mobile interface provides users with an easy-to-learn and user-friendly environment. Third, regarding data collection, the virtual assistant can actively contact users for data input, whereas traditional HMIs do not support this functionality.
5. Conclusions

In this study, an interactive design is proposed to resolve two practical issues at manufacturing sites: how to increase communication between dashboards and users and how to improve data completeness. The design explores the interaction between the dashboard and humans through the virtual assistant, which also plays a proactive role in seeking missing and incomplete data. The proposed dashboard has a three-layer structure composed of a data layer, a processing layer, and a presentation layer. The virtual assistant not only communicates with users to execute the given tasks but can also actively contact the related staff to ensure data completeness. As a case study, we created a dashboard in a demonstration room and implemented a voice-based virtual assistant integrated into a smartphone app. By adopting the proposed approach, users can interact with the dashboard and, most importantly, perform in-depth queries with support from the virtual assistant.
Data Availability

The data used to support this study have not been made available, as the supplier prevents this.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
Acknowledgments

The authors gratefully acknowledge the experimental support for this work from Tunghai University.
References

- S. Pfeiffer, “The vision of “Industrie 4.0” in the making-a case of future told, tamed, and traded,” Nanoethics, vol. 11, no. 1, pp. 107–121, 2017.
- S. Yin and O. Kaynak, “Big data for modern industry: challenges and trends [point of view],” Proceedings of the IEEE, vol. 103, no. 2, pp. 143–146, 2015.
- S. Few, Information Dashboard Design: The Effective Visual Communication of Data, O’Reilly Media, Inc., Newton, MA, USA, 2006.
- H. Tokola, C. Gröger, E. Järvenpää, and E. Niemi, “Designing manufacturing dashboards on the basis of a Key Performance Indicator survey,” Procedia CIRP, vol. 57, pp. 619–624, 2016.
- Y. Park and I.-H. Jo, “Factors that affect the success of learning analytics dashboards,” Educational Technology Research and Development, vol. 67, no. 6, pp. 1547–1571, 2019.
- G. McArdle and R. Kitchin, “The Dublin dashboard: design and development of a real-time analytical urban dashboard,” ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 19, p. 25, 2016.
- A. Dingli and F. Haddod, “Interacting with intelligent digital twins,” in International Conference on Human-Computer Interaction, pp. 3–15, Springer, Cham, Switzerland, July 2019.
- J. S. Jwo, C. S. Lin, C. H. Lee, and C. Wang, “A lightweight application for reading digital measurement and inputting condition assessment in manufacturing industry,” Mobile Information Systems, vol. 2021, Article ID 5555833, 2021.
- M. Farmanbar and C. Rong, “Triangulum city dashboard: an interactive data analytic platform for visualizing smart city performance,” Processes, vol. 8, no. 2, p. 250, 2020.
- A. Sarikaya, M. Correll, L. Bartram, M. Tory, and D. Fisher, “What do we talk about when we talk about dashboards?” IEEE Transactions on Visualization and Computer Graphics, vol. 25, no. 1, pp. 682–692, 2018.
- K. Thorlund, L. Dron, J. Park, G. Hsu, J. I. Forrest, and E. J. Mills, “A real-time dashboard of clinical trials for COVID-19,” The Lancet Digital Health, vol. 2, no. 6, pp. e286–e287, 2020.
- B. D. Wissel, P. J. Van Camp, M. Kouril et al., “An interactive online dashboard for tracking COVID-19 in U.S. counties, cities, and states in real time,” Journal of the American Medical Informatics Association, vol. 27, no. 7, pp. 1121–1125, 2020.
- J. Booth, B. Di Eugenio, I. F. Cruz, and O. Wolfson, “Robust natural language processing for urban trip planning,” Applied Artificial Intelligence, vol. 29, no. 9, pp. 859–903, 2015.
- S. Mennicken, R. Brillman, J. Thom, and H. Cramer, “Challenges and methods in design of domain-specific voice assistants,” in Proceedings of the 2018 AAAI Spring Symposium Series, Palo Alto, CA, USA, March 2018.
- Y. Zhang, W. Chan, and N. Jaitly, “Very deep convolutional networks for end-to-end speech recognition,” in Proceedings of the 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4845–4849, IEEE, New Orleans, LA, USA, March 2017.
- L. Lugosch, M. Ravanelli, P. Ignoto, V. S. Tomar, and Y. Bengio, “Speech model pre-training for end-to-end spoken language understanding,” in Proceedings of the Interspeech, pp. 814–818, Shanghai, China, October 2019.
- M. Chen, G. Lampouras, and A. Vlachos, “Sheffield at e2e: structured prediction approaches to end-to-end language generation,” E2E NLG Challenge System Descriptions, vol. 85, 2018.
- M. Wieland, P. Hirmer, F. Steimle et al., “Towards a rule-based manufacturing integration assistant,” Procedia CIRP, vol. 57, pp. 213–218, 2016.
- C. Sinsky, L. Colligan, L. Li et al., “Allocation of physician time in ambulatory practice: a time and motion study in 4 specialties,” Annals of Internal Medicine, vol. 165, no. 11, pp. 753–760, 2016.
- “Nuance AI-powered virtual assistants for healthcare,” 2021, https://www.nuance.com/healthcare/ambient-clinical-intelligence/virtual-assistants.html.
- “Robin healthcare,” 2021, https://www.robinhealthcare.com.
- U. U. Rehman, D. J. Chang, Y. Jung, U. Akhtar, M. A. Razzaq, and S. Lee, “Medical instructed real-time assistant for patient with glaucoma and diabetic conditions,” Applied Sciences, vol. 10, no. 7, p. 2216, 2020.
- Y. Jeong, A. Singh, M. Zafarzadeh, M. Wiktorsson, and J. B. Hauge, “Data-driven manufacturing simulation: towards a CPS-based approach,” in Swedish Production Symposium (SPS2020), Jönköping, Sweden, October 2020.
- K. Grevenitis, F. Psarommatis, A. Reina et al., “A hybrid framework for industrial data storage and exploitation,” Procedia CIRP, vol. 81, pp. 892–897, 2019.
- B. S. Nascimento, A. S. Vivacqua, and M. R. Borges, “A flexible architecture for selection and visualization of information in emergency situations,” in Proceedings of the 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 3317–3322, IEEE, Budapest, Hungary, October 2016.
- M.-H. Tsai, H.-Y. Chan, and L.-Y. Liu, “Conversation-based school building inspection support system,” Applied Sciences, vol. 10, no. 11, p. 3739, 2020.
- F. Tao, Q. Qi, A. Liu, and A. Kusiak, “Data-driven smart manufacturing,” Journal of Manufacturing Systems, vol. 48, pp. 157–169, 2018.
Copyright © 2021 Jung-Sing Jwo et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.