Abstract

In this paper, we consider a new discrete-time model that describes the spread of information by sharing in some kind of online environments such as Facebook, WhatsApp, and Twitter. The impact of sharing on the information amount is investigated, which is incorporated in the considered model as a supplement compartment. We consider the possible interactions between individuals and information on the Internet, such as posts, images, and videos. The theory of control is used to show the effectiveness of our optimal control strategy in reducing the information amount and sharers and then decreases the dissemination of false information that can lead to annoying situations and unstable state in the society. Numerical simulations are performed to investigate several scenarios before and after the use of our strategy of control. Furthermore, sensitivity analysis of the information amount on parameters is simulated and discussed.

1. Introduction

Facebook, Twitter, WhatsApp, Instagram, and others are often used as a useful tool to share information with a huge number of people in a very restricted time. People that use these platforms can share their feelings, news, and any kind of information, which can be true or false. In the case of unconfirmed information, we talk then about rumors [1]. Nowadays, one can find easily a big amount of information on the Internet, and sometimes it reaches us without looking for it. But it is very hard to identify the true ones. In some cases, information that comes from close people, like friends, family members, or virtual close friends has some sort of credibility. As a result, it can spread widely in seconds and pose risks to public opinion, which can have unpredictable and irreparable consequences.

The goal of disseminating information varies based on the objectives of their promoters and their ideas. Sometimes, the goal is commercial; one seeks to increase the demand for a particular product, as a marketing method, challenging the competitors of the company by distorting the facts, spreading an unrealistic fictional story, or making a story with a small part of reality, which can lead to a major source of crises for commercial organizations [24]. For instance, four rumors related to food products were identified in the Egyptian market [5]: the employ of the rotten meat in many famous restaurants, as well as the use of donkey meat in some meals; a video that shows damaged tomatoes in making sauce and ketchup; a video that shows damaged guava used in the juice industry; and many videos explaining the health damage caused by eating noodles prepared by a company.

Some of the companies involved reacted quickly to these rumors by publishing short videos explaining manufacturing processes and distributing documents confirming the quality and safety of their products. Nevertheless, some of them ignored the treatment of rumors and this was the trend of most restaurants. It is therefore difficult to determine to what extent the spread of rumors on social networks affects the behavior of consumers and to what extent companies are affected by these rumors.

These platforms can also be used to create disorders within communities and/or modern movements. The most prominent example in this context is the Arab Spring. Nine out of ten Egyptians and Tunisians indicated in a poll that they used Facebook to manage their protests and aware people. As a government intervention, social networks are blocked [6]. In contrast, most news companies and governments have also begun to embrace Facebook, Twitter, and other social platforms to communicate information, provide clarifications, and thus reduce anxiety [6]. For instance, the Moroccan government has issued a press release on its official Facebook page to calm the inflammatory situation caused by the publication and sharing of rumors on social networks about the spread of the flu H1N1 [7]. Therefore, the impact and consequences of the information spread should not be underestimated.

With the rapid development of mobile telephony and the Internet, the online social network is fundamentally different from the traditional social network. Dissemination and control of information, especially of public opinion, has become a focus and key element for public policymakers; this is a very urgent and useful topic for research on how to deal with such outbreaks.

Epidemiological models are considered an important tool in the modeling of dynamical phenomena. In 1927, Kermack and McKendrick devised the Susceptible-Infected-Removed (SIR) model which can be considered as an interesting contribution in the mathematical theory of epidemics [8]. In recent decades, epidemic modeling approaches have extended the field of epidemiology and various mathematical models have been designed to analyze the evolution of an epidemic according to population systems [913]. Such a kind of models divides the studied population (humans, animals, etc.) into different compartments representing individuals’ states [1417].

The spread of information is similar to the spread of diseases; consequently, the above models are widely considered. Laarabi et al. in their paper [14] investigated the spreading and the evolution of rumors; they proposed a delayed rumor propagation model. Xiong et al. in [18] proposed a new model called SCIR describing the dissemination of information. In [19], the authors proposed an online social network information spreading (OSIS) model combining epidemic models and individual’s cognitive psychology. Huo et al. [20] proposed an optimal control of the rumor spreading model with psychological factors and time delay. Notwithstanding the above, all these documents take into account continuous-time models and do not consider the amount of information as a model compartment.

We are motivated to explore some mathematical tools in the modeling of information spread on social media; accordingly, we investigate an optimal control approach to show if one is able to control such dissemination or not. To do this, we devise in this paper an information spreading discrete-time model combining the information amount on the Internet with a three-compartmental model, in which we incorporate the interaction between information and individuals, like posts, videos, and images. One reason for the upsurge of discrete epidemic models is that discrete models have advantages in describing outbreaks since data are usually collected in discrete-time units, which would make it more convenient to use discrete-time models and the ease with which data can be compared to the simulated results [15].

Therefore, the studied environment is divided as follows: Ignorant (I), Sharers (S), Removed (R), and the information amount (F). Here, we are more interested to investigate a new modeling and optimal control approaches, which aim to describe the impact of controls on the dissemination of information on the Internet, and which could also be applied in the case of rumors that are not necessarily confirmed and can spread very quickly and reach a huge number of people.

In the following sections, we begin by describing the different components of the suggested ISRF model for the information spread. In Section 3, we choose to minimize an objective function and we announce theorems of sufficient and necessary conditions and characterization of the sought optimal controls functions related to the clarifications and the suppression of information posts’ optimal control approach. Sought solutions (optimal controls) related to the chosen minimization criteria are simulated with their associated states in Section 4. Finally, we conclude the paper in Section 5.

2. Presentation of the Model

Information spread easily by any means, word of mouth, emails, phone calls, social networks, etc. With the help of every advanced technology that facilitates human communication, information is spreading rapidly. One of the most important factors of information spread is the introduction of the “Share” button, which accompanies any status update, link, video, or photo posted. It allows viewers of content (e.g., friends and followers of the creator) to share the publication. For example, on Facebook, if the content was originally published publicly, then it can be viewed and shared by everyone [1].

We devise here a compartmental model to study the dissemination of information in online environments of N users (Facebook, WhatsApp, or Twitter groups or pages) by posting, sharing, and discussing. In these online environments, when a user posts information (text, image, video, etc.), only his neighbors can see it and determine if that information needs to be shared again or not. If the information is so interesting and some neighbors decide to share it, neighbors of the author’s neighbors can then see and reshare them again. The influence of the information then exceeded the local scope of the author and can be widely distributed on the network. On the other hand, if none of the original author’s neighbors are attracted by this information, it will disappear soon and very few users will see it. When a user shares a post, the information is displayed on its homepage for a long time; even if he does not care about it anymore, all his neighbors can always see the information he has shared. At the same time, if the neighbors have seen the post and do not share it immediately, they may lose interest gradually and ignore that information.

If a user notices that some information is repeated and shared by several of his neighbors, then he will discuss it with his friends through the chat tools or face to face, so he can determine the relevance of this information and then decide to share it or not.

Our model consists of four compartments: Ignorant (I), Sharers (S), Removed people (R), and the amount of the information (F). The term “Ignorant” means a person that does not know about the information. The word “Sharer” is used to denote that a person is attracted by the information and/or he finds it funny or interesting; then he decides to share it. The term “Removed” means a person who has seen the post and has decided not to share it, because of irrelevance or for other personal reasons. In the case of disaster information, “Removed” ones are people who do not care about the matter. We kept the term Removed from the classical SIR epidemiological model to denote individuals removed from the sharing system. All transmissions are modeled using the mass action principle, which accounts for the probability of transmission in contact between the different compartments.

Each piece of information has the potential of sharing, but one can find some information not useful or does not fit the user interests, and then there is no need to share it. For example, if the information is about a concern of the public opinion (raising costs of education, election cheats, public safety, etc.), the probability of shares will be very important. Therefore, the potential relevance of the information will be taken into account and it will be defined based on the proportions of shares. But, before sharing information, people need to reach it.

We assume here that information can be reached in two ways. The first is the Profile-Profile Transmission (PPT) which can be any kind of person to person transmission. It can be also an offline conversation between a Sharer and an Ignorant. The second is the Profile-Information Transmission (PIT), which is the case when the information reaches the user without looking for it. It can be a post in social networks or some advertising signboards either on the Internet or in the real world.

Let us define the potential relevance of the information by the average by the PPT and the average by the PIT, while the potential irrelevance of the information is defined by the average by the PPT and by the PIT.

We assume that when information is shared, it is relevant. An Ignorant becomes a Sharer just after he shares the information by the PPT at the rate and by the PIT at a rate . An Ignorant that decides not to share the information by the PPT becomes Removed at a rate and by the PIT at a rate . A Sharer that contacts a Removed and he decides not to share the information anymore becomes a Removed at a rate . Information is posted on the Internet at a rate or shared by Ignorant after some contacts with the information at the rate . Also information is shared by Sharers at a rate , while posts, images, and videos can be deleted by users for some reason at a rate . Ignorant, Sharer, and Removed individuals can leave the environment at rates , , and , respectively. We assume that all new members are recruited as Ignorant at the constant rate . All these interactions happen at the instant i.

Without loss of generality, we assume that the number of new individuals is equal to the number of outgoing users; that is, .

The model resulting from these assumptions is governed by the following system:where , and for all i. For simplicity, we have put and and ; note that

A flow chart for the model is shown in Figure 1, and parameters’ description can be found in Table 1.

3. The Optimal Control Problem

3.1. Presentation of the Controls

To eradicate the dissemination of information, some governments prefer to block all communications and ban social media platforms [6]. But this strategy of control can lead to some protests or upsets. In seeking good interventions, governments began using social media to control the spread of some annoying information, by commenting on false information to correct it or on real ones to confirm it.

In February 2019, some infections and deaths associated with the H1N1 virus were recorded, making the media the main platform for disseminating this information and leaving Moroccan society in a state of confusion and worry. This prompted the Ministry of Health to intervene on February 5, 2019, to respond to news from the social networks on deaths due to seasonal influenza, by issuing a joint press release with the WHO country office at Morocco on its official Facebook page [7] so that residents do not worry about the spread of the seasonal H1N1 virus [21]. This communication between people and government institutions builds trust, leads to stability, and reduces anxiety.

By following this direction seeking for control strategies, we propose a control strategy using two types of controls; the first one () represents the effectiveness of comments and clarifications from official institutions such as government or any credible source, for example, the publication of the press release of the Moroccan Ministry of Health on its official Facebook page [7] and on its official website [21]. This control represents also the effect of documents and videos released in WhatsApp or on YouTube to aware people and/or to reveal the truth as the case of the companies in the Egyptian Market that reacted to rumors by publishing short videos explaining manufacturing processes and distributing documents confirming the quality and safety of their products [5].

The second control (u) is the complement of the first; it is the average of posts’ deleted and stopping sharING. In the situation of rumors, after the truth was revealed, people not only will refrain from sharing that rumor but will also comment on others’ posts by pointing to links to the truth, resulting in a lack of resharing and deleting posts. One thing that discourages resharing is when others comment on it by referring to external sources where the validity of the rumor is discussed. People who spread that rumor may try not to continue sharing it or stay away from these rumors if they know that they are wrong [1]. Sometimes, governments criminalize the resharing of some information on social networks due to the seriousness of the situation. In such situations, almost all group administrators delete all publications related to this information and block the publication or sharing of these contents on their pages.

Based on these facts, we introduce the two control variables u and such thatwhere , and for all i. For simplicity, we put and .

3.2. Objective Functional

The main objective of this optimal control strategy is to reduce the number of Sharers and increase the number of Removed with optimal costs of applying controls. Then, the problem is to minimize the objective functional given bywhere are the weight constants of Controls, the Sharers, and Removed, respectively, , , and is the final time of our strategy of control.

Our goal is to minimize Sharers, minimize the cost of applying controls, and increase the number of Removed individuals. In other words, we are seeking optimal controls and such thatwhere and are the control sets defined bysuch that

3.3. Sufficient Conditions

Theorem 1. There exists an optimal control such thatsubject to the control system (5)–(8) and initial conditions.

Proof. Since the parameters of the system are bounded and there are a finite number of time steps, that is, I, S, R, and F are uniformly bounded for all in the control set , thus is also bounded for all which implies that is finite, and there exists a sequence such thatand corresponding sequences of states are , , , and .
Since there are a finite number of uniformly bounded sequences, there exists and , , , and such that, on a sequence,Finally, due to the finite dimensional structure of system (5)–(8), the objective function , is an optimal control with corresponding states , , , and , which complete the proof.

3.4. Necessary Conditions

By using a discrete version of Pontryagin’s maximum principle [9, 15, 22, 23], we derive necessary conditions for our optimal controls. For this purpose, we define the Hamiltonian as

Theorem 2. Given optimal controls and solutions , and , there exists ; the adjoint variables satisfy the following equations:where are the transversality conditions. In addition,

Proof. Using the discrete version of Pontryagin’s maximum principle [22, 23], we obtain the following adjoint equations:with . To obtain the optimality conditions we take the variation with respect to controls ( and ) and set it equal to zeroThen we obtain the optimal control pairBy the bounds in and of the controls in definitions (12) and (13), it is easy to obtain and in the following form:

4. Simulation and Discussion

We now present numerical simulations associated with the above-mentioned optimal control problem. We write code in MATLAB™ and simulated our results using data from Table 2. The optimality systems are solved based on an iterative discrete scheme that converges following an appropriate test similar to the one related to the Forward-Backward Sweep Method (FBSM). The state system with an initial guess is solved forward in time and then the adjoint system is solved backward in time because of the transversality conditions. Afterward, we update the optimal control values using the values of state and costate variables obtained at the previous steps. Finally, we execute the previous steps until a tolerance criterion is reached.

In all the simulations below, the minutes were used as a time unit. The spread of information occurs faster than epidemics. We focus here on information that is more appealing and has the potential to be shared.

Without loss of generality, and as an example, we have chosen as a studied population, a group (in Facebook, Twitter, WhatsApp, etc.) with 1000 members that can be considered as the Ignorant group.

At the initial time , one Sharer is introduced to this group with information which can be a posted video or image or text.

All parameters of Table 2 are chosen to get a situation in which the number of Sharers rises above 90% of the population and the Removed group remains under 10%. In this situation, it can be shown that our proposed strategy of optimal control is very efficient to reduce the number of Sharers and the amount of the information while it increases the number of the Removed population with an optimal cost.

In Figure 2, it can be seen that about 5 hours from the injection of the information, there is no more Ignorant which means that the information reaches almost all the members of the group. We talk then about an explosion of the information. In the case of false information, this situation can lead to serious economic and/or political damages because it can be seen from this figure that the more the number of Sharers is big the more the amount of the information is huge. The proliferation of information cannot be stopped; consequently, it can spread out to external groups and reaches other spreaders in other places.

Figure 3 shows the dynamic of I, S, R, and F in model (5)–(8) with the two control functions u and , where it can be seen that the number of Sharers remains almost null, while the number of the Removed people rises about 100%. Making comments and explanations on social networks could be another way to ensure people’s safety in an emergency. More people reading official government comments, namely, correcting information or interpretations, will probably deal with the subject rationally. For example, after the posting of explanatory videos by the companies involved in the Egyptian market [5], people stopped sharing these rumors. As a result, these companies have succeeded in gaining consumer confidence again.

To achieve these optimal results by following our control strategy, we suggest using it in the first 24 hours of information appearing. When those concerned do not provide much information, people can be left feeling that they have something to hide, which leads to a lot of gossips. In the case of government rumors, some information can build trust and strengthen social stability.

It can be inferred that the control strategy that we propose is very effective and can be used to block such misinformation. Note that this control strategy not only has eradicated information, but has also blocked its proliferation.

Figure 4 depicts control functions u and defined in (23) and (24), respectively, resulting from the optimal control problem. It can be seen here that even if these optimal controls take small values, they lead to satisfying results. When the necessary clarifications are provided about information (Figure 4(b)), users discover that the information was incorrect or prohibited, so they suppress their resharing and abstain from publishing them again (Figure 3(b)). Despite that fact, the amount of information continues to grow, but at a very limited pace which can be safely ignored (Figure 3(b)), compared to the amount of information in the case when there are no controls (Figure 2(b)).

Figure 5 depicts the impact of parameter on the information amount. To do that, the function F is plotted as an output of two variables, time and parameter. From that figure it can be seen that either the controls are used or they are not; the higher the parameter, the smaller the amount of information.

This can be interpreted by the fact that some information cannot be shared, even if it has a big potential relevance and/or very interesting news, for example, personal information, horrible death of a celebrity, and death of Presidents. In these situations, people decide to forget such information, and then it can be considered as information with a huge parameter; therefore it will die out very quickly. For that reason, one can say that the amount of information is very sensitive to the information deleting parameter .

Figure 6 shows the sensitivity of information amount to the potential relevance parameter by the PIT, , where it can be seen that the amount of information F rises according to the values of ; the higher the values of , the greater the amount of information, whereas the information amount exceeds 3300 in the case of without controls, and it remains less than 20 when we use controls. Even when controls are used, information with a big potential relevance parameter by the PIT resists and continues to grow, but with a weak proliferation, which can be neglected (see Figure 6(a)).

Information with less than 0.03 can be easily eradicated in this control strategy, and the information amount did not exceed 20 with those greater than 0.03, while the information amount reaches more than 1000 when and more than 3000 with some bigger potential relevance parameter values in the absence of the optimal controls (see Figure 6(b)).

Information that has been published by more than one user can spread out very quickly; consequently, it can reach a massive audience. If such users keep this information published in their pages, then it will reach more neighbors over time. Accordingly, this information is considered as information with a big relevance parameter by the PIT, . For that reason, information with big values of relevance parameter represents the important information that has been published a long time. And so there is a need for our optimal control strategy.

One can see in Figure 7 the impact of the parameter on information amount. There is some form of stability in the shape of the amount of information when is too small, but we can see that even if has very small values of about 0.002, the amount of information reaches very big numbers about 7300 (see Figure 7(b)). The most important result of this paper is the one shown in Figure 7(a), where the insensitivity of the information amount to parameter can be seen. When controls are used, there are no changes in the shape of F even if has been changed. The amount of information does not exceed 10, whereas takes some large values. This proves the efficiency of our control strategy that makes the amount of information insensitive to the number of shares.

The impact of the potential relevance parameter by the PPT on the information amount is shown in Figure 8 where it can be seen that when takes some big values, then the information amount decreases (see Figure 8(b)). The amount of information increases by about 5300 when takes very small values below 0.01. And it stays under 1300 when exceeds 0.03. In comparison with the case when the control policy is followed, the amount of information keeps small values less than 10.

This can be interpreted by the fact that when people transmit information to each other via private messages or direct conversations, the amount of information on the Internet decreases and become nonmeasurable. As in Figure 7, it can be seen in Figure 8(a) that the information amount will be insensitive to the potential relevance parameter by the PPT when the optimal control strategy is followed. This means that if this control strategy is used, then the shares of information by the PPT can be neglected and it does not have any impact on the information amount.

The unexpected result can be seen in Figure 9. has a negative impact on the information amount when there are no controls. The higher the value of , the smaller the information amount which is a logical result, whereas information that has a big potential irrelevance parameter in the PPT vanishes quickly (see Figure 9(b)). People deliberately do not discuss unimportant information, either on the chat or in real life and this is what makes this information disappear over time without any control strategy. It can be seen in Figure 9(b) that the amount of information is less than 500, if is greater than 0.05. The amount of information exceeds 2300 if is less than 0.01.

When the optimal control strategy is used, unexpectedly, the impact of the potential irrelevance parameter on the information amount will change. The higher the values of , the bigger the amount of information; see Figure 9(a), where we can see that when takes values below 0.01, the amount of information is less than 20, whereas when some values above 0.05 are taken by , the information amount exceeds 300.

Unimportant information can cause more gossip if it is posted on official pages or if it is handled by responsible people. It is clear that this information will not be shared and will disappear without any intervention (see Figure 9(b)). But if we use any type of intervention, we will shed light on worthless information. As a result, it will reach more people.

5. Conclusion

In this paper, a new discrete-time model that describes the spread of information by shares in some kind of online environments such as Facebook, WhatsApp, and Twitter is considered. The impact of sharing information on the information amount and the relationship between the number of Sharers and the information amount is investigated. All possible interactions between individuals and information on the Internet are taken into account in the modeling process, such as posts, images, and videos. We defined two types of transmission of information, the PPT and the PIT; then we defined the potential relevance of information in each type of transmission. The theory of control is used to show the effectiveness of our optimal control strategy in reducing the information amount and Sharers to control the proliferation of false information that can lead to serious economic and/or political damages. We applied a discrete version of Pontryagin’s maximum principle to state the necessary conditions and characterization of our optimal controls. Based on the optimal control approach that we suggested here, we have exhibited also the impact of shares which reduces effectively the information amount. Finally, we have succeeded to show the effectiveness of our optimal control approach when it is applied, and then we proved that when we restrict the number of shares, we can keep a negligible information amount or with insignificant relevance. Sensitivity analysis of the information amount on parameters is simulated and discussed.

Data Availability

No data were used to support this study.

Conflicts of Interest

The authors declare that they have no conflicts of interest.