Abstract

As a special commodity, medicine plays a vital role in people's healthy lives. The basic effects of drugs are excitation and inhibition: under the action of a drug, any effect that enhances or increases the function of the body's original tissues and organs is called excitation, while any effect that weakens or reduces that function is called inhibition. The nature of drug action has three aspects: regulating function, counteracting pathogens and tumors, and complementary therapy. If there is a problem with the quality and safety of medicines, it is tantamount to profiting at the cost of people's lives. Based on artificial intelligence, this paper analyzes the current situation of and improvement strategies for quality management in pharmaceutical production enterprises and proposes how to reduce drug safety risks with the assistance of artificial intelligence technology. The experimental results show that in 2015, before company A had conducted a drug risk assessment, sales of heparin sodium APIs were 2.099 billion yuan, accounting for 91.6% of that year's operating income. After the outbreak of drug risks, sales in 2016 were 1.743 billion yuan, accounting for 77.1% of the 2016 operating income. After the measures were finally implemented, sales in 2019 were 4.743 billion yuan, equivalent to 329.1% of the 2016 operating income. The research method in this paper can uncover hidden drug safety problems more efficiently and can improve profit while ensuring safety.

1. Introduction

As the concept of safe drug use has become deeply rooted among the public, higher standards and requirements have been put forward for drug quality. It is necessary to achieve rational drug use and to use drugs effectively, safely, appropriately, and economically based on the systematic knowledge and theories of contemporary drugs and diseases. The purpose is to give full play to the efficacy of the drug and minimize the incidence of adverse reactions. These requirements prompt drug manufacturers to pay more attention to the quality management and control of drugs while continuously improving production efficiency. For large amounts of data, computers have far more powerful processing capabilities than humans. Compared with subjective human judgment, computer processing performs objective data analysis according to the stated requirements. At the same time, it eliminates inconsistent conclusions that quality assurance personnel might otherwise reach because of differences in experience and working state, which lead to results of varying precision. In this way, analysis reports with high accuracy and stable conclusions can be provided to decision makers.

Faced with the challenge of implementing the newly revised drug production quality management standards, the scope of quality assurance work keeps expanding and the responsibilities keep growing heavier, which shows that quality assurance work is facing a severe test. Vigorously promoting the application of computer technology in quality assurance work and continuously improving the knowledge level of relevant practitioners are urgent problems to be solved. The lack of quality awareness within enterprises is also a major hidden danger. First, quality management is treated as a mere formality, carried out only to meet regulatory requirements without a real understanding of the essence of GMP management. Second, when dealing with problematic products, a few pharmaceutical companies still count on luck or even ignore the problems entirely. It is necessary to make full and effective use of the large amount of production information generated during manufacturing, to mine the laws and associations hidden in these data, and to effectively associate deviation information with the various data. This enables quality assurance personnel to find the best solutions and analysis results when conducting system analysis.

This article mainly presents a detailed data analysis of company A's drug management system under the introduction of artificial intelligence algorithms. The innovation is that, where the analysis of the personnel screening system was not clear enough, this paper adopts advanced artificial intelligence technology, which effectively reduces errors and makes the experimental results closer to reality.

2. Related Work

At present, the management of drug production has attracted much attention, and more and more scholars have carried out research on it. Among them, Selezneva et al.'s study was dedicated to the role of laboratory research in ensuring the quality of domestic medicines; it also reviewed and analyzed regulatory documents and current publications on the subject [1]. The purpose of Kashirina et al.'s study was to examine current industrial practice in drug quality risk management at Russian pharmaceutical companies, including an assessment of the main issues in implementing a risk management system and its compliance with accepted international methodologies [2]. Drugs are a key element of the healthcare system, and their logistics management spans process planning, purchasing, storing, dispensing, recording, and reporting. The purpose of Wulandari et al.'s study was to explore the logistics management of medicines at a pharmacy installation in the work area of the health office in the Kraben district [3]. The drug shortages faced by the US pharmaceutical industry and government in recent years have been a major challenge. Jia and Zhao addressed the problem of drug shortages from a supply chain perspective, identifying a key missing piece, and proposed reducing shortages through drug purchase contracts [4].

Medication and drug optimization play an important role in managing modifiable physiological risk factors and noncommunicable diseases (NCDs). The purpose of Syed et al.'s study was to describe the number of prescriptions for type 2 diabetes (T2DM), hypertension, and hyperlipidemia; its findings provide information needed to inform pharmaceutical policy and practice [5]. Medications should be provided promptly to all patients, whether adults or children. The purpose of Ueyama et al.'s study was to examine the current status and characteristics of pediatric drug development in Japan, using information on the lag in approval of pediatric indications between Japan and the European Union [6]. In recent years, with growing awareness of pain management and rising quality-of-life requirements, patient-controlled analgesia (PCA) has been widely used in patients with various kinds of pain. The mining technique proposed by Jin and Wu was used to analyze relevant literature; they tried to identify the main drugs used in PCA, classify them, and mine important drug combination rules [7]. These articles illustrate the importance of drug quality well, but none of them is based on artificial intelligence; they are therefore out of step with today's mainstream technologies and have certain limitations.

3. Artificial Intelligence Algorithms

3.1. Artificial Intelligence

Artificial intelligence (AI) is a new technological science that studies and develops theories, methods, techniques, and application systems for simulating, extending, and expanding human intelligence [8]. Artificial intelligence is a branch of computer science that tries to understand the nature of intelligence and to produce new kinds of intelligent machines that can respond in a manner similar to human intelligence. Research in this area includes robotics, speech recognition, image recognition, natural language processing, and expert systems. In terms of research directions, the field of artificial intelligence currently also covers machine learning, knowledge representation, automatic reasoning, computer vision, and robotics; at present, in addition to machine learning (deep learning), natural language processing and computer vision are also hot topics [9]. Since the birth of artificial intelligence, its theory and technology have become increasingly mature, and its application fields have kept expanding. It is conceivable that future technological products brought by artificial intelligence will be "containers" of human intelligence [10]. Artificial intelligence can simulate the information processes of human consciousness and thinking [11]. AI is not human intelligence, but it can think like a human and may even surpass it. The structural framework of one stage is shown in Figure 1.

COM is the abbreviation of Component Object Model, a software development technology developed by Microsoft for component-based software production in the computer industry. This project uses VC++ 6.0 to design artificial intelligence software. At the same time, in order to be compatible with the FaultDoctor 2.0 software developed by China Aerospace Measurement and Control Company and to avoid troublesome switching between programs, the artificial intelligence software is packaged into a module through COM component technology. It can either use data obtained in other ways for diagnosis or directly use the artificial intelligence module in FaultDoctor 2.0 to diagnose and collect data from the test platform. Both methods incorporate artificial intelligence technologies such as wavelet analysis, information fusion, data mining, and neural networks, and their principles are exactly the same [12]. The schematic diagram of the overall design is shown in Figure 2.

3.2. AI Introduction Algorithm

In nature, flocks of birds, schools of fish, bees, ants, and other social creatures show amazing efficiency in their activities, and their collective intelligence is far greater than the sum of their individual intelligences. Whether for a fish, a bird, a bee, or an ant, the probability of surviving alone is extremely low. Each individual therefore needs to cooperate with others to complete more complex work, so that the group as a whole can thrive.

The particle swarm optimization (PSO) algorithm is an intelligent algorithm proposed in 1995 by simulating the foraging process of bird flocks and combining it with human cognition and other social behaviors [13]. The genetic algorithm is a method of searching for optimal solutions by simulating the process of natural evolution; it uses computer simulation to convert the problem-solving process into a process similar to the crossover and mutation of chromosomal genes in biological evolution. Because the algorithm is simple and easy to implement, PSO has been applied in many fields such as numerical function optimization, acoustic signal processing, and antenna array design. With further research, various improved algorithms continue to emerge; these improvements broaden the application field of particle swarm optimization, and it still attracts the attention and research of a wide range of scholars.

Particle swarm optimization is a random search algorithm. During the search, a random point in the multidimensional search space is treated as a particle without size or mass, also called an individual. During the movement, each individual is guided by the best position it has passed through itself and the best position passed through by the entire population, while the perturbation scale in both guiding directions is assigned randomly. Let D denote the dimension of the search space and NP the number of particles. Each particle is a D-dimensional vector $x_i = (x_{i1}, x_{i2}, \ldots, x_{iD})$, $i = 1, 2, \ldots, NP$, with velocity $v_i = (v_{i1}, v_{i2}, \ldots, v_{iD})$. The swarm updates its search according to the following two formulas [14]:

$$v_i^{t+1} = v_i^t + c_1 r_1 \left( p_i - x_i^t \right) + c_2 r_2 \left( p_g - x_i^t \right),$$
$$x_i^{t+1} = x_i^t + v_i^{t+1},$$

where t records the current generation and t + 1 the next generation, $p_i$ is the best position found so far by particle i, $p_g$ is the best position found so far by the whole swarm, $c_1$ and $c_2$ are acceleration constants, and $r_1$, $r_2$ are random numbers uniformly distributed on [0, 1]. In order to prevent overflow, the velocity needs to be bounded as follows:

$$v_{id} = \begin{cases} V_{\max}, & v_{id} > V_{\max}, \\ -V_{\max}, & v_{id} < -V_{\max}. \end{cases}$$
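To make the update rule above concrete, the following is a minimal Python sketch of the basic particle swarm update with velocity clamping; the function name, the sphere objective, and all parameter values (NP, D, c1, c2, Vmax, the search bounds) are illustrative assumptions, not settings taken from this paper.

```python
import numpy as np

def basic_pso(obj, D=10, NP=30, c1=2.0, c2=2.0, v_max=1.0, iters=200, seed=0):
    """Minimal basic PSO: velocity/position update with Vmax clamping (illustrative)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (NP, D))          # particle positions
    v = np.zeros((NP, D))                    # particle velocities
    p_best = x.copy()                        # personal best positions
    p_val = np.apply_along_axis(obj, 1, x)   # personal best objective values
    g_best = p_best[p_val.argmin()].copy()   # global best position

    for _ in range(iters):
        r1, r2 = rng.random((NP, D)), rng.random((NP, D))
        # v_i(t+1) = v_i(t) + c1*r1*(p_i - x_i) + c2*r2*(p_g - x_i)
        v = v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
        v = np.clip(v, -v_max, v_max)        # bound velocities to prevent overflow
        x = x + v                            # x_i(t+1) = x_i(t) + v_i(t+1)

        vals = np.apply_along_axis(obj, 1, x)
        improved = vals < p_val              # update personal bests
        p_best[improved], p_val[improved] = x[improved], vals[improved]
        g_best = p_best[p_val.argmin()].copy()
    return g_best, p_val.min()

# Example: minimize the sphere function (assumed test objective)
best_x, best_f = basic_pso(lambda z: float(np.sum(z ** 2)))
print(best_f)
```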

Binary encoding refers to the binary code language, a language that the computer can recognize directly without translation; the format of each machine instruction and the meaning of each code are rigidly stipulated. This paper studies the effect of using binary encoding in PSO and concludes that the binary-coded particle swarm algorithm converges much faster than the genetic algorithm and is particularly effective on higher-dimensional problems [15]. The velocity update of the elementary particle swarm depends too strongly on the previous velocity, which makes it easier for particles to become trapped in the region of a local solution. To reduce this dependence, an inertia weight factor w is applied to the previous velocity, giving the velocity update formula:

$$v_i^{t+1} = w v_i^t + c_1 r_1 \left( p_i - x_i^t \right) + c_2 r_2 \left( p_g - x_i^t \right).$$

The velocity update formula based on the inertia factor is more in line with the physics-inspired model of the particle swarm. Numerical experiments show that when w > 1.2, the algorithm has weak convergence but strong search performance; when w < 0.8, it has weak search performance but strong convergence; and when 0.8 ≤ w ≤ 1.2, the algorithm achieves a good balance between search ability and convergence. In complex optimization problems, a linearly decreasing inertia weight causes the global search ability to decline in the later stage of evolution. Such problems can be handled by adaptively changing the weight coefficient: setting w to an adaptive quantity that decreases linearly with the evolutionary generation achieves good optimization results on four nonlinear functions. The shrinkage factor method is generally used in particle swarm optimization rather than in neural networks. When Vmax is small (for Schaffer's f6 function, Vmax = 3), it is better to use the weight w = 0.8; if no Vmax information is available, using 0.8 as the weight is also a good choice. When the inertia weight w is small, the local search ability of the particle swarm algorithm is emphasized; when it is large, the global search ability is emphasized. For complex problems, the shrinkage factor method is introduced in this paper. It ensures the convergence performance of particle swarm optimization by controlling the weight coefficient w and the control parameters $c_1$ and $c_2$ [16]. The velocity update formula is:

$$v_i^{t+1} = \chi \left[ v_i^t + c_1 r_1 \left( p_i - x_i^t \right) + c_2 r_2 \left( p_g - x_i^t \right) \right],$$

where $\chi$ is the shrinkage factor, calculated as:

$$\chi = \frac{2}{\left| 2 - \varphi - \sqrt{\varphi^2 - 4\varphi} \right|}, \qquad \varphi = c_1 + c_2, \ \varphi > 4.$$
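As a quick illustration of the shrinkage (constriction) factor described above, the short sketch below computes χ from assumed acceleration constants c1 = c2 = 2.05 (a commonly cited setting, not a value taken from this paper) and applies it in one velocity update step.

```python
import math
import numpy as np

def constriction_factor(c1=2.05, c2=2.05):
    """Shrinkage factor chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|, with phi = c1 + c2 > 4."""
    phi = c1 + c2
    assert phi > 4, "the constriction form requires c1 + c2 > 4"
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

def constricted_velocity(v, x, p_best, g_best, c1=2.05, c2=2.05, rng=None):
    """One constricted velocity update: v <- chi * [v + c1*r1*(p - x) + c2*r2*(g - x)]."""
    rng = rng or np.random.default_rng()
    chi = constriction_factor(c1, c2)
    r1, r2 = rng.random(v.shape), rng.random(v.shape)
    return chi * (v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x))

print(round(constriction_factor(), 4))  # approximately 0.7298 for c1 = c2 = 2.05
```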

Scientists have proposed a new particle swarm algorithm in which an "attraction" operator and a "repulsion" operator are introduced to maintain population diversity. The velocity update formula for the "repulsion" phase is as follows:

$$v_i^{t+1} = w v_i^t - c_1 r_1 \left( p_i - x_i^t \right) - c_2 r_2 \left( p_g - x_i^t \right).$$

By defining a calculation formula for population diversity, the upper and lower bounds of the diversity are calculated, and the local and global optimization abilities of the algorithm are dynamically adjusted by switching between different velocity update formulas [17]. This paper proposes a new velocity update formula that lies between the "attraction" operation and the "repulsion" operation.
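The paper's own hybrid update formula is not reproduced here; as an illustrative stand-in, the sketch below shows the standard diversity-guided (ARPSO-style) switch between the attraction and repulsion updates, where the sign dir flips when the population diversity crosses assumed lower/upper thresholds d_low and d_high. All names and threshold values are assumptions for illustration only.

```python
import numpy as np

def diversity(x):
    """Average distance of particles from the swarm centroid (one common diversity measure)."""
    centroid = x.mean(axis=0)
    return float(np.linalg.norm(x - centroid, axis=1).mean())

def arpso_velocity(v, x, p_best, g_best, dir_sign, w=0.7, c1=1.5, c2=1.5,
                   d_low=1e-4, d_high=0.25, rng=None):
    """Attraction (dir=+1) / repulsion (dir=-1) velocity update, switched on diversity bounds."""
    rng = rng or np.random.default_rng()
    d = diversity(x)
    if d < d_low:
        dir_sign = -1          # diversity too low: switch to the repulsion phase
    elif d > d_high:
        dir_sign = +1          # diversity restored: switch back to the attraction phase
    r1, r2 = rng.random(v.shape), rng.random(v.shape)
    v_new = w * v + dir_sign * (c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x))
    return v_new, dir_sign
```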

3.3. Introduction of Artificial Bee Colony Algorithm

The artificial bee colony (ABC) algorithm is another swarm intelligence algorithm that can effectively solve numerical optimization problems. Its excellent performance on high-dimensional numerical optimization problems and continuous problems has attracted much attention from scholars, and how to use the artificial bee colony algorithm to solve discrete problems has long been a research hotspot. With its excellent characteristics, the artificial bee colony algorithm has achieved rich research results in fields such as integer programming, multi-objective optimization, and neural network training, and it has been used in major conferences and academic journals to solve various complex optimization problems.

The artificial bee colony algorithm can be divided into four parts: initializing food sources, leader bee mining, follower bee mining, and scout bees replacing food sources. A food source is initialized as follows:

$$x_{ij} = x_j^{\min} + r_{ij} \left( x_j^{\max} - x_j^{\min} \right),$$

where $r_{ij}$ is a random number in (0, 1) and $x_j^{\min}$, $x_j^{\max}$ are the lower and upper bounds of the j-th dimension.

Leader bees and follower bees mine according to the following formula:

$$v_{ij} = x_{ij} + \phi_{ij} \left( x_{ij} - x_{kj} \right),$$

where $v_{ij}$ is a new solution generated in the neighborhood of $x_{ij}$; j and k are randomly selected, with k indexing a solution in the neighborhood of i (that is, k cannot equal i); and $\phi_{ij}$ is a random number in [−1, 1] that controls the range of neighborhood generation [18].

The follower bee chooses a food source according to the waggle dance of the leader bee. For a minimization problem, the fitness value is generated according to the following formula:

$$fit_i = \begin{cases} \dfrac{1}{1 + f_i}, & f_i \geq 0, \\ 1 + \left| f_i \right|, & f_i < 0, \end{cases}$$

where $f_i$ is the objective function value of the i-th solution [19]. The selection probability of each individual is given by:

$$p_i = \frac{fit_i}{\sum_{n=1}^{NP} fit_n}.$$
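Putting the three ABC formulas above together, the following sketch implements food source initialization, the employed/onlooker neighborhood search step, the fitness transform, and the roulette selection probabilities; the bounds, colony size, and sphere objective are assumptions used only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def init_food_sources(n_sources, lower, upper):
    """x_ij = x_j^min + r_ij * (x_j^max - x_j^min), with r_ij ~ U(0, 1)."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    return lower + rng.random((n_sources, lower.size)) * (upper - lower)

def neighborhood_search(x, i):
    """v_ij = x_ij + phi_ij * (x_ij - x_kj) for one random dimension j and neighbor k != i."""
    n, d = x.shape
    k = rng.choice([s for s in range(n) if s != i])   # neighbor index, k != i
    j = rng.integers(d)                               # randomly chosen dimension
    phi = rng.uniform(-1.0, 1.0)
    v = x[i].copy()
    v[j] = x[i, j] + phi * (x[i, j] - x[k, j])
    return v

def fitness(f_val):
    """fit = 1/(1+f) if f >= 0 else 1 + |f| (for minimization problems)."""
    return 1.0 / (1.0 + f_val) if f_val >= 0 else 1.0 + abs(f_val)

def selection_probabilities(fits):
    """p_i = fit_i / sum(fit_n), used by the follower (onlooker) bees."""
    fits = np.asarray(fits, float)
    return fits / fits.sum()

# Tiny usage example on an assumed 5-dimensional sphere objective
X = init_food_sources(10, lower=[-5] * 5, upper=[5] * 5)
f = [float(np.sum(xi ** 2)) for xi in X]
p = selection_probabilities([fitness(fi) for fi in f])
print(p.round(3))
```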

This paper introduces the best-so-far method, an improved version of the artificial bee colony algorithm. In this method, the best position found so far is used to guide the search toward the optimal solution. On this basis, this paper proposes to use this method to update the candidate solutions of the follower bees. This not only improves the local search ability of the algorithm but also makes it search in the optimal direction and accelerates convergence. The update formula of the solution is as follows:

$$v_{ij} = x_{ij} + \phi_{ij} \, fit_{best} \left( x_{ij} - x_{best,j} \right),$$

where $\phi_{ij}$ is a random number on [−1, 1], $fit_{best}$ is the fitness value of the best food source so far, and $x_{best,j}$ is the j-th dimension variable value of the best food source so far [20]. However, while this speeds up convergence, the algorithm tends to fall into local optima. Therefore, this paper proposes a method of adaptively adjusting the search radius and uses the following formula to update candidate solutions during the scout bee search phase:

$$x_{ij}^{new} = x_{ij} + \phi_{ij} \left[ \omega_{\max} - \frac{t}{W} \left( \omega_{\max} - \omega_{\min} \right) \right] x_{ij},$$

where $x_{ij}$ is the abandoned food source solution, $\phi_{ij}$ is a random number in [−1, 1], W is the maximum number of iterations, t is the current iteration number, and $\omega_{\max}$ and $\omega_{\min}$ are the maximum and minimum proportions of the scout bee's position adjustment, with values between 0.2 and 1.
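As a sketch of the two modified update rules just described (the best-so-far follower-bee update and the adaptive scout-bee radius), the code below follows the reconstructed formulas above; the default values of w_max and w_min and the argument names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def best_so_far_update(x_i, x_best, fit_best):
    """Follower-bee update guided by the best food source so far:
    v_ij = x_ij + phi_ij * fit_best * (x_ij - x_best_j), with phi_ij ~ U(-1, 1)."""
    phi = rng.uniform(-1.0, 1.0, size=x_i.shape)
    return x_i + phi * fit_best * (x_i - x_best)

def adaptive_scout_update(x_abandoned, t, W, w_max=1.0, w_min=0.2):
    """Scout-bee update with a search radius that shrinks linearly over iterations:
    x_ij_new = x_ij + phi_ij * [w_max - (t / W) * (w_max - w_min)] * x_ij."""
    phi = rng.uniform(-1.0, 1.0, size=x_abandoned.shape)
    radius = w_max - (t / W) * (w_max - w_min)
    return x_abandoned + phi * radius * x_abandoned
```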

This paper notes that the initialization of the population has a great influence on the convergence speed of the algorithm and the quality of the final solution; the basic algorithm uses a random generation strategy. It is known that the initial conditions of a chaotic map have a nonnegligible effect on the resulting chaotic sequence, so using a chaotic map to generate the initial population can increase population diversity. The initialization formula for improving the performance of the algorithm is as follows:

$$x_{ij} = x_j^{\min} + z_{ij} \left( x_j^{\max} - x_j^{\min} \right),$$

where $z_{ij}$ is an element of a chaotic sequence. In order to further accelerate the convergence speed of the algorithm, an antilearning (opposition-based learning) strategy is also introduced:

$$x_{ij}' = x_j^{\min} + x_j^{\max} - x_{ij}.$$
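The sketch below illustrates the chaotic initialization and the antilearning (opposition-based) candidates described above, then keeps the best individuals from the combined pool. The choice of the logistic map as the chaotic sequence and the pool-selection detail are assumptions consistent with common practice, not necessarily the paper's exact choices.

```python
import numpy as np

def chaotic_opposition_init(n, lower, upper, obj, seed=0.37):
    """Initialize a population with a logistic chaotic map plus opposition-based candidates."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    d = lower.size
    z = np.empty((n, d))
    z_k = seed
    for i in range(n):                      # logistic map: z_{k+1} = 4 * z_k * (1 - z_k)
        for j in range(d):
            z_k = 4.0 * z_k * (1.0 - z_k)
            z[i, j] = z_k
    pop = lower + z * (upper - lower)       # chaotic individuals
    opp = lower + upper - pop               # opposition-based (antilearning) individuals
    pool = np.vstack([pop, opp])
    scores = np.apply_along_axis(obj, 1, pool)
    return pool[np.argsort(scores)[:n]]     # keep the n best as the initial population

init_pop = chaotic_opposition_init(20, [-5] * 3, [5] * 3, lambda x: float(np.sum(x ** 2)))
print(init_pop.shape)  # (20, 3)
```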

Finally, the best individuals are selected to form the initial population.

The population initialization of BSA (the backtracking search algorithm) is the same as that of the differential evolution algorithm. However, BSA initializes both the population P and a historical population oldP, as shown in the following formulas:

$$P_{i,j} \sim U\!\left( low_j, up_j \right), \qquad oldP_{i,j} \sim U\!\left( low_j, up_j \right),$$

where $up_j$ and $low_j$ denote the upper and lower bounds of the j-th dimension component, respectively, and U denotes the uniform distribution [21].

After selection-I, the individuals in oldP are randomly reordered and reassigned to oldP. The mutation operator then generates the initial trial population:

$$Mutant = P + F \cdot \left( oldP - P \right).$$

Here F = 3 × randn, where randn is a random number that obeys the standard normal distribution. Through standardization, any random variable that obeys a general normal distribution can be transformed into a standard normal distribution with a mean of 0 and a standard deviation of 1; a random variable obeying the standard normal distribution is conventionally denoted by z.

During the crossover operation, BSA randomly selects l elements from each individual in the parent population P and swaps them with the same-dimension elements of the corresponding individual in the mutated population Mutant to generate a new individual, where l is an integer in (0, D). The crossover length is selected as follows:

$$l = \left\lceil BE \cdot rand \cdot D \right\rceil,$$

where rand is a random number uniformly distributed on (0, 1).

BE is the crossover probability and D is the problem dimension.
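A compact sketch of the BSA initialization, mutation, and crossover steps described above is given below; the mixrate-style crossover mask, the boundary handling, and all parameter values are illustrative assumptions following the reconstructed formulas, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

def bsa_init(n, lower, upper):
    """P_ij ~ U(low_j, up_j); the historical population oldP is initialized the same way."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    return rng.uniform(lower, upper, (n, lower.size))

def bsa_mutate(P, oldP):
    """Permute oldP, then Mutant = P + F * (oldP - P) with F = 3 * randn."""
    oldP = oldP[rng.permutation(len(oldP))]
    F = 3.0 * rng.standard_normal()
    return P + F * (oldP - P), oldP

def bsa_crossover(P, mutant, BE=1.0):
    """For each individual, swap l = ceil(BE * rand * D) randomly chosen dimensions
    of P with the same dimensions of Mutant to form the trial individual."""
    n, D = P.shape
    trial = P.copy()
    for i in range(n):
        l = int(np.ceil(BE * rng.random() * D))
        dims = rng.choice(D, size=max(l, 1), replace=False)
        trial[i, dims] = mutant[i, dims]
    return trial

lower, upper = [-10] * 5, [10] * 5
P, oldP = bsa_init(30, lower, upper), bsa_init(30, lower, upper)
mutant, oldP = bsa_mutate(P, oldP)
trial = np.clip(bsa_crossover(P, mutant), lower, upper)  # keep trial solutions within bounds
print(trial.shape)  # (30, 5)
```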

The two improvements are described separately below. The results of the test functions, classified as multimodal (M), non-separable (N), unimodal (U), and separable (S), are given in Table 1 to illustrate the effect of the improvements.

The BSA variation scale coefficient is denoted by F. Table 1 compares the convergence effect of the new scale coefficient $F_{new}$ and the original F on the 30-dimensional Ackley function. The improved scaling factor is a random number obeying the Maxwell–Boltzmann distribution, which describes the distribution of ideal gas molecular speeds at thermal equilibrium. Extensive numerical experiments have shown that the Maxwell–Boltzmann distribution can perturb the population efficiently, making the mutation process perform better on the experimental population [22]. This indicates that the new variation scale factor has better search performance. The new scaling factor is as follows:

$$F_{new} = \sqrt{bh_3}, \qquad bh_3 \sim \chi^2(3),$$

where $\chi^2(3)$ is a Chi-square distribution with three degrees of freedom and $bh_3$ is a random number obeying this distribution. Figure 3 shows the distribution diagram of its convergence function.
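For illustration, the snippet below samples the reconstructed scale factor F_new = sqrt(bh3), with bh3 drawn from a Chi-square distribution with three degrees of freedom (i.e., a Maxwell–Boltzmann-shaped magnitude); the side-by-side comparison with F = 3 × randn is an assumption added for context only.

```python
import numpy as np

rng = np.random.default_rng(4)

def f_new(size=None):
    """F_new = sqrt(bh3), bh3 ~ Chi-square(3): always positive, Maxwell-Boltzmann shaped."""
    return np.sqrt(rng.chisquare(df=3, size=size))

def f_standard(size=None):
    """Standard BSA scale factor F = 3 * randn, shown only for comparison."""
    return 3.0 * rng.standard_normal(size=size)

samples_new, samples_std = f_new(100000), f_standard(100000)
print(samples_new.mean().round(3), samples_new.min() >= 0)   # smaller magnitude, never negative
print(np.abs(samples_std).mean().round(3))                   # larger average magnitude
```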

From the convergence curve in Figure 3, it can be seen that the new variation scale coefficient achieves a fairly good convergence effect in the function test. It is worth noting that the new distribution does not produce a negative variation scale coefficient and that its magnitude is smaller, which improves the search ability of the algorithm. In the selection strategy below, e is a random number in [0, 1]. In order to ensure that the first q individuals can be selected uniformly, this paper designs the following selection strategy to generate the new mutation population [23].

Let indp be the set of sequence numbers of the first p individuals in the whole population. The elements of indp are randomly sorted k times, and the result of each sorting is stored; finally, the k sorted sequences are merged into an array ind. The generation of the new mutation population is shown in Figure 4.

In this experiment, eight high-dimensional unimodal functions and eight high-dimensional multimodal functions were selected. The eight high-dimensional multimodal functions are shown in Table 2.

The comparison results of the convergence time and convergence results of the improved bee colony algorithm, the standard bee colony algorithm, and the backtracking search algorithm are shown in Table 3.

From the results of the eight test functions, it can be seen that the convergence effect of IABC is better than that of ABC and BSA. This indicates that the improved employed-bee stage based on a hybrid multidimensional and one-dimensional search strategy can overcome the slow convergence caused by the purely one-dimensional search of the employed bees; it shortens the search time and effectively speeds up convergence. Even when solving high-dimensional problems such as the Griewank function, the search efficiency of IABC is much higher than that of ABC and BSA. This is partly due to the more developed search strategy in the employed-bee stage and partly due to the effective selection strategy in the follower-bee stage, which fully guarantees population diversity and makes the algorithm less likely to fall into a local optimum even as convergence is accelerated. The new search strategy and selection strategy not only speed up convergence but also ensure the high stability of the algorithm. Finally, for the accuracy of the experimental data, this paper adopts the improved ABC algorithm for the statistical analysis of the data, ensuring that errors are reduced to the greatest extent.

4. Data Analysis of Drug Production Quality Management

4.1. Experimental Case

Company A is committed to the research and development, production, and sales of heparin series products. Years of development have enabled the company to gradually achieve a leading position in the field of domestic heparin APIs and heparin preparations. Not long after company A's new plant was completed and put into operation, a major personnel adjustment was made, and the company is in an initial run-in period in terms of team building: professional backgrounds, ages, and work experience vary greatly. Most of the company's personnel have weak quality awareness, even in the production, warehousing, quality management, and other departments closely related to drug quality. There have been cases where personnel specialists entered the production area to check posts without washing their hands, changing clothes, or wearing shoe covers and headgear, and workers borrowing items between production workshops entered the clean area without wearing clean clothes, all of which caused quality risks. The flow of personnel during the run-in period also makes it difficult to improve the quality management system. The company's quality management system is established in accordance with the requirements of international GMP, drawing on advanced quality management experience at home and abroad. GMP is the abbreviation of good manufacturing practice; the World Health Organization defines GMP as regulations that guide the production and quality management of food, pharmaceuticals, and medical products. The document system covers verification, hygiene, materials, status identification, plant and facilities, organization, personnel, and other contents. There are more than 1,700 documents in total, and upgrading them is difficult because of their number and wide-ranging content; omissions also occur after an upgrade. Training on a new set of documents involves a delay period of learning and assimilation, during which deviations arise. The document system design is rather decentralized: although it is detailed, there are repetitions. Moreover, the training on these written contents does not distinguish well between personnel of different competence levels [24].

Figure 5 shows the main quality management structure of company A. Company A's quality management implements a general manager responsibility system, and its authorized responsible person is the quality director, who is specifically responsible for all quality management activities. The quality department is established separately and is not subordinate to the production department or other departments. The quality director is a deputy general manager of the company and is in charge of the quality department, the production department, the laboratory, and other departments. The warehouse is also incorporated into the quality department: QA personnel are dispatched to the warehouse to sample and release supplier materials and to supervise and guide warehouse administrators in storing finished products. The company generally implements the on-site "5S" management system of pharmaceutical production enterprises, which refers to sorting, straightening, sweeping, standardizing, and sustaining discipline.

The problem of insufficient management ability among some middle and senior leaders has also appeared. Although some of them have worked in the old factory for more than ten years, their accumulated experience in some respects is still not deep enough. The quality department is an extremely important department for a manufacturing enterprise: it has the right to supervise and manage all activities and personnel related to product quality, including supervising SOP implementation and the filling in of records during production, supplier auditing and management, personnel training, change management, deviation management, CAPA management, QC, the warehouse, verification, and so on. However, some of this work is not done well enough. Without capable managers above and without ordinary employees with higher skill levels and strong awareness below, various quality management tasks cannot be carried out smoothly. This weakens the implementation of GMP and prevents GMP thinking from continuously guiding the work. The problems are particularly prominent in change management, deviation management, training management, and supplier management, which keeps product quality from being maintained at a high level.

In the end, although company A's various drugs have not caused serious quality and safety accidents that endanger people's lives, problems of one kind or another will occur if management is not standardized. As a result, the company cannot pass US FDA certification or European CEP/COS certification. A large part of heparin products is exported to obtain high sales volume, so company A, whose main drug product is heparin, must have suffered huge profit losses. The export value of major heparin exporters is shown in Figure 6.

Hypuri went public in May 2010. The company's main product is heparin sodium API, and it is currently the largest manufacturer of heparin APIs in China. The "Hepalink" brand has become an international benchmark for the quality of heparin sodium APIs and is also the company's core revenue source. In 2015, the company's sales of heparin sodium APIs were 2.099 billion yuan, accounting for 91.6% of that year's operating income. Sales in 2016 were 1.743 billion yuan, accounting for 77.1% of the 2016 operating income, and sales in 2019 were 4.743 billion yuan, accounting for 99.8% of the 2019 operating income. Jianyou is the second largest exporter of heparin APIs in China; in recent years, its heparin API sales have been about 400 million yuan [25]. Moreover, company A's heparin sodium drug varieties have not obtained US FDA and EU CEP certificates and are not allowed to enter the US and European markets for sale.

The new version of GMP clarifies the requirements for change control in Articles 241 and 242: operating procedures must first be established, stipulating the application, evaluation, review, approval, and implementation of changes in packaging materials, production processes, facilities, instruments, workshops, raw and auxiliary materials, equipment, quality standards, and inspection methods, and the quality department is designated to be responsible for change control. Enterprises should evaluate the potential impact of changes on product quality, mainly according to the scope and nature of the change, and classify changes according to the severity of the impact (type I, type II, and type III changes). Change control can be regarded as a main quality management system indicator in pharmaceutical manufacturing enterprises. To better illustrate the situation of company A, the following makes a horizontal comparison between the number of changes generated by company A from January to December 2015 and the number of changes generated by two other companies during the same period, as shown in Figure 7:

Through the comparative analysis of the above charts, it can be seen that the number of changes at company A shows an upward trend in the statistical period. In one year, 98 changes were made, which already exceeds the whole-plant control indicator of 75. Compared with company A, the number of changes at company B in the same period is lower and remains at a more reasonable level. This shows that company A's limited change management resources are insufficient to manage so many changes effectively. There are two problems in the change management of company A at this stage.

(1) When a new change is proposed, such as the purchase of new production equipment, GMP requires evaluating the impact of the new equipment on the existing equipment and even the whole system, which in turn requires detailed management procedures as a guide. However, the actual process and method have not been established, and the department that initiated the change does not know how to implement it. Instead of submitting normative materials, they simply explain the change on the application form, which is signed and reviewed by the person in charge of quality and then easily approved. The whole process relies excessively on the quality director, and change managers struggle to track the progress of change implementation. As a result, the situation as grasped is inconsistent with the actual situation, and there is a certain lag.

(2) There is no scientific demonstration of what should be done, when, and to what extent. For example, to improve detection efficiency, save unnecessary work, and reduce the number of tests for intermediates, it was stipulated, based on the testing situation in the first half of the year and the current production arrangement, that every five batches constitute one sampling and testing cycle, and the content test is omitted for the other batches of intermediates. For the five batches involved in the change, without the support of a scientific argumentation report, it is impossible to judge whether the testing frequency is reasonable [26].

4.2. Data Mining
4.2.1. Cluster Analysis Applied to Quality Assurance

Pharmaceutical manufacturers have higher requirements for data, which is directly related to the quality of drug production and indirectly affects the safety of people's lives. As a data mining method, cluster analysis plays a vital role in pharmaceutical companies. Cluster analysis is a main task of exploratory data mining and a common technique for statistical data analysis, used in many fields including machine learning, pattern recognition, image analysis, information retrieval, bioinformatics, data compression, and computer graphics. In all walks of life, value discovery from big data at the macro or micro level is largely the result of cluster analysis.

The experimental data from laboratory statistics and the production data submitted by the workshop are submitted to the quality assurance department. Quality assurance personnel should analyze these data and apply cluster analysis to linearly coherent data, looking for the hidden correlations between the data and using cluster analysis to classify the experimental and production data, so as to meet the needs of data sorting and risk analysis.

Cluster analysis is used to classify and divide the experimental data. In this way, the dense and sparse regions of the experimental data can be identified, and the global distribution pattern of the experimental data can be found. This global distribution pattern provides the analyst with the necessary knowledge. To control measurement errors in production and testing, it is often used to formulate upper and lower warning values and upper and lower control values according to the density of the data distribution.
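As one common (assumed) way to turn the data distribution into warning and control values, the sketch below uses the conventional mean ± 2σ warning limits and mean ± 3σ control limits; the thresholds and the sample content values are illustrative and are not taken from company A's procedures.

```python
import numpy as np

def warning_control_limits(values):
    """Compute upper/lower warning (mean ± 2*sigma) and control (mean ± 3*sigma) limits."""
    values = np.asarray(values, float)
    mean, sigma = values.mean(), values.std(ddof=1)
    return {
        "lower_control": mean - 3 * sigma, "lower_warning": mean - 2 * sigma,
        "upper_warning": mean + 2 * sigma, "upper_control": mean + 3 * sigma,
    }

# Hypothetical content measurements (% of label claim)
content = [100.1, 100.3, 99.8, 100.0, 100.2, 99.9, 100.4, 100.1]
print(warning_control_limits(content))
```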

Taking an infusion enterprise as a simple example, Figure 8 shows the pattern of pH value and content when a deviation occurs. From the figure, it can be clearly seen that when a certain risk deviation occurs, the pH value and content data in the laboratory change. Using cluster analysis, the risk deviation data can be clustered and distinguished, the central aggregation point found, and the pH value and content of that center displayed. These valuable data are directly related to the production situation and the quality of purchased raw and auxiliary materials, so it can be judged whether the production situation and the quality of raw and auxiliary materials in the year are stable. They provide important information for re-preparing the liquid medicine to control its pH value and content.
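The following is a minimal clustering sketch in the spirit of the analysis above: it groups hypothetical (pH, content) pairs with k-means and reports the cluster centers that would correspond to the "central aggregation points". The data values, the choice of k = 2, and the use of scikit-learn are all illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical laboratory records: columns are (pH value, content %)
data = np.array([
    [6.9, 100.1], [7.0, 100.2], [6.8, 100.0], [7.1, 100.3],   # normal batches
    [6.3, 99.4],  [6.2, 99.5],  [6.4, 99.3],                  # batches around a deviation
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
print("cluster centers (pH, content):")
print(kmeans.cluster_centers_.round(2))
print("labels:", kmeans.labels_)
```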

4.2.2. Data Selection

The 188 sets of experimental data are divided according to a gradient. The principle of division is to start from the minimum value and set each 0.1 step as an interval, giving 14 intervals from the minimum value of 99.5 to the maximum value of 100.9. The number of groups of content data falling in each interval is counted, and the percentage of the content data in each interval relative to the total data is calculated, as shown in Table 4.
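A short sketch of the interval division described above is given below: it bins content values into 0.1-wide intervals from 99.5 to 100.9 and reports the count and percentage per interval. The synthetic values are purely illustrative stand-ins; the paper's 188 real measurements are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic stand-in for the 188 content measurements (not the paper's real data)
content = np.clip(rng.normal(100.2, 0.3, 188), 99.5, 100.9)

edges = np.arange(99.5, 100.9 + 0.1, 0.1)       # 15 edges -> 14 intervals of width 0.1
counts, _ = np.histogram(content, bins=edges)
percent = 100.0 * counts / counts.sum()

for lo, hi, c, p in zip(edges[:-1], edges[1:], counts, percent):
    print(f"[{lo:.1f}, {hi:.1f}): {c:3d} groups, {p:5.1f}%")
```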

4.2.3. Data Analysis

The statistical pie chart of the annual production situation gives decision makers the most intuitive impression and plays a vital role in helping them grasp the production progress of the workshop. Based on the production statistics provided by quality assurance personnel, decision makers can effectively formulate work priorities and plans for the next year. This ensures that the products produced by pharmaceutical companies can meet market needs, with neither a large backlog of inventory nor a market shortage.

First, the direct impact of time factors on production is analyzed. From the annual production statistics in Figure 9, it can be seen that for pharmaceutical manufacturers, differences in temperature and humidity are important factors in determining production; the human body also responds markedly to seasonal changes and suffers from different seasonal diseases, so the market demand for different products is not the same. Only by grasping the correct production situation can the production plan for the next year be effectively adjusted.

According to the analysis, the infusion workshop mainly produces large quantities of saline 10 and 50 sugar 20, which means that the market popularity of these two products is high. The production of saline 10 is mainly concentrated in April, May, June, and July, while the production of 50 sugar 20 is mainly concentrated in February, August, and December. The analysis suggests that this phenomenon occurs for seasonal reasons. Therefore, decision makers can be advised to adjust the production plan for the increase or decrease in usage caused by seasonal changes.

The occurrence of deviations is directly related to the time factor. Different seasons bring different conditions, and different time periods require different production adjustments. For example, differences in temperature and humidity affect the spot inspection of production equipment, and the reproduction of insects in different seasons affects the frequency of insect and rodent control work. Quality assurance personnel should analyze the internal relationship between the time factor and the occurrence of deviations.

Analysis of the figure shows that there are many deviations caused by filling equipment each year. Therefore, it is recommended to carry out a comprehensive repair and maintenance of the filling equipment in the next year, thereby reducing the risk of deviation caused by the filling equipment.

At the same time, deviations due to personnel errors occurred frequently at the end of the year. The analysis attributes this to excessive relaxation of personnel at the end of the year. Therefore, the training and supervision of personnel during this period should be strengthened to minimize or even avoid personnel errors.

According to the analysis, environmental deviation mainly occurred in spring and autumn. Spring is the breeding season for insect recovery, and autumn is also the reserve time for insects to lay eggs and prepare for winter. In these two seasons, the frequency of insect and rodent control should be strengthened. The hidden danger points that can breed insects in the workshop should be dealt with in time to prevent them from polluting the production products and affecting the product quality.

The histogram of the relationship between deviation types and products is shown in Figure 10.

From the above two histograms, it is not difficult to find that, because of the larger number of production batches, deviations occur more frequently for 50 sugar 20 than for saline 10. The reason is that the viscosity of the 50 sugar 20 product is high, which burdens the machines and the production environment; moreover, the market demand for this kind of medicine is large and the production pressure is high, resulting in a concentration of deviations. In response, the management of equipment and environment should be strengthened when producing 50 sugar 20, and better methods for handling products with higher viscosity should be sought [27].

5. Discussion

The construction and improvement of a drug quality management system is a long-term, comprehensive, and systematic project, and unsound, imperfect quality management greatly restricts the development of the enterprise. This paper provides a detailed summary and analysis of the problems revealed after a pharmaceutical company comprehensively implemented the GMP management system. It drives the change of employees' concepts and awareness, clarifies the objects of quality improvement, puts forward specific improvement measures, and completes the design of the improvement plan. The purpose is to form a practical and effective work system that can be implemented smoothly, so that the company's current quality management system can be optimized.

6. Conclusions

The improvement of the quality management system is based on the premise that the quality management problems are solved. In the process of checking for omissions, filling gaps, finding problems, and solving them, the research conclusions of this paper are drawn step by step:

(1) There are commonalities among different quality management concepts.
(2) It is necessary to effectively reduce deviation problems and handle change problems.
(3) It is necessary to efficiently complete production management and quality management work based on internal audit.

Of course, the implementation of GMP in pharmaceutical production enterprises is only basic-level work. Quality management in pharmaceutical production is a complex process. It needs to use information about the system, product, and process to overcome the weak links of the quality management system, to ensure its continuous improvement, and to obtain stable and improved drug quality. Reasonable analysis of data with different attributes promotes the enterprise's utilization of data and yields more knowledge. The author also hopes that future work will make the display of the analysis results more complete and strengthen the visualization, so that analysts can obtain a well-organized report immediately, rather than a simple post hoc analysis graph.

Data Availability

No data were used to support this study.

Conflicts of Interest

The author declares that there are no conflicts of interest regarding the publication of this article.

Acknowledgments

This work was supported by 2020 Zhejiang philosophy and social science planning project (20NDJC221YB): Research on Collaborative Governance Path of High-quality Development of Drugs in Zhejiang Province under Big Data Strategy.