Journal of Applied Mathematics
Volume 2013 (2013), Article ID 838694, 15 pages
Proactive Communicating Process with Asymmetry in Multiagent Systems
1School of Computer Science and Technology, Tianjin University, Tianjin 300072, China
2School of Computer Software, Tianjin University, Tianjin 300072, China
Received 6 March 2013; Accepted 29 April 2013
Academic Editor: Xiaoyu Song
Copyright © 2013 Jiafang Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
This paper presents a formalized communicating process for dealing with information asymmetry between agents. A proactive process can improve the efficiency of dealing with asymmetry by allowing agents to take the initiative in communication in a goal-oriented way. In the process, by reasoning on beliefs and intentions about the world and figuring out the information needed, an agent proactively requests information from another agent when asymmetry exists between them. Considering that agents may take advantage of information asymmetry by hiding information, the process also includes a model based on game theory to restrict such hiding behaviour. The work presented here not only introduces a definition of information asymmetry from a cognitive perspective but also proposes a way to deal with it by communication in MAS. In addition, this paper presents some basic ideas on designing proactive mechanisms for cooperation between agents.
Information asymmetry exists when a party or parties possess greater informational awareness relative to other participating parties, and this information is pertinent to effective participation in a given situation. In a Multiagent System (MAS), agents represent entities with different interests, so information asymmetry can benefit some agents in teamwork while producing poor results for others. This paper presents a formalized communicating process in which agents deal with information asymmetry by reasoning on their knowledge about the world and figuring out the information needed when facing asymmetry.
To deal with issues caused by information asymmetry, probability and statistical mechanisms are usually employed. Such mechanisms usually rely on the history of interaction between agents, but in some situations, such as at the beginning of an interaction, such historical information is unavailable. From the cognitive view, if an agent can take proactive action to figure out what information is lacking in cooperation, it can request that information directly from the agents who possess it, and it can also adopt strategies to restrict information hiding (or even cheating) by considering the context. Information asymmetry can then be resolved in a proactive manner.
Proactive behaviour is considered one of the key characteristics of software agents. Proactivity usually refers to an agent's ability to make conscious decisions without being told to [3, 4], meaning that agents take actions to help each other according to some common goals without being instructed. Under such a willingness-to-help assumption, research on proactive behaviour, such as SharedPlans or joint intention [5–10], usually ignores the issue of information asymmetry. However, we can treat "eliminating information asymmetry" as a common goal in teamwork, and the communication process can then be modelled from the cognitive view.
Research has also been conducted on proactive behaviour in human organisations, including the areas of feedback seeking and issue selling [11–23]. These studies show that proactive communication between people helps resolve information asymmetry. From the view of information economics, researchers have also proposed several models to analyse how to obtain an optimal contract under information asymmetry. In this paper, we combine proactive behaviour modelling in MAS, proactive communication, and game theory to provide an efficient way of dealing with information asymmetry between agents in teamwork.
The work described here first introduces a formalized description of the communication process for dealing with information asymmetry from the cognitive point of view. Second, by combining a game-theory-based model with the communication process, information hiding is restricted according to context. Finally, the work provides some basic ideas for designing proactive communication processes between agents. In a scenario of information asymmetry, the agent that needs information takes the initiative to identify that information and requests it from the agent that owns it. Such a proactive manner can also be used to deal with other problems in communication, such as trust establishment: the trustor can proactively collect information from the trustee, rather than just waiting to observe the trustee's behaviour or waiting for information from a third party.
2. The Proactive Communicating Process for Dealing with Information Asymmetry
To facilitate the following discussion, a simple scenario of information asymmetry is introduced first. During the development of software, the requirements of a customer change over time. Generally speaking, a customer is usually not familiar with the technologies, while the developer is not familiar with the business requirements. Thus information asymmetry exists between the customer and the developer. The developer may take advantage of this asymmetry to refuse a new requirement in order to gain unreasonable benefits.
In this section, the formal description of communication for dealing with information asymmetry in a proactive manner is presented. Information asymmetry and the related processes are expressed with the mental attitudes of agents. These mental attitudes are described with modal operators such as Bel, Int.To/Int.Th, Attempt, Inform, and Request, which are proposed in joint intention theory, SharedPlans, and the work on proactive information exchange [3–8, 24–26].
Information asymmetry sometimes exists in scenarios where the participants do not even realise it. The process presented here focuses on how to deal with information asymmetry in a proactive manner, so we assume that the communicating participants realise that information asymmetry exists in their cooperation. In the previous scenario, the customer should consider how to deal with information asymmetry proactively, in order to add the new requirement without paying an unreasonable cost. Meanwhile the developer should consider how to handle the customer's request and take advantage of the information asymmetry.
2.1. The Definition of Information Asymmetry in the Communication Process
First, the different roles that the two agents play in the asymmetry deserve discussion. Unless otherwise stated, this paper uses rich to denote the agent that owns information and poor to denote the agent short of information. Here information asymmetry means that for a certain proposition, there is an agent that does not believe the proposition to be true or false, while another agent does believe it to be true or false, or forms such a belief by reasoning on its mental attitudes and knowledge base. Suppose that prop(poor) and prop(rich) are the sets of propositions that poor and rich own in their mental attitudes and knowledge bases, respectively, and Rules is the set of rules of the two agents, all written as Horn clauses. We then define Rules() as the set of propositions appearing in those rules.
We assume that in cooperation poor needs to form a belief about some proposition based on rich's belief about it. Here poor is the agent short of information and rich is the agent with information. Poor needs to get some information from rich to complete the reasoning process of forming this belief. If rich cannot provide all the information poor needs, but poor needs to provide some information to help rich get the information needed by poor, then for those parts of the information there is a role exchange between the two agents: rich becomes the agent short of information and poor becomes the agent with information. Such a process is described in Figure 1.
Now we introduce the representation of information asymmetry in the communication process. First, the two participants of the communication should be included in the representation, as well as the role of each participant: who needs the information and who provides it.
Second, information asymmetry relates to certain propositions which form the intentions, beliefs, and other mental attitudes of agents. For instance, in the software scenario, the proposition in the representation of information asymmetry should be "the customer intends the developer to implement a new requirement."
Third, information asymmetry exists under a certain context. For instance, in the scenario, if the developer and the customer belong to the same company and the developer is a subordinate of the customer, the developer should tell the customer the feasibility of a new requirement, and it is not necessary to deal with information asymmetry in such a situation.
With the previous discussion, information asymmetry can be represented with an operator AsymInfo whose arguments include the two agents, their roles, the proposition concerned, the input and output variables, and the context constraints.
In the definition, the first two arguments are the agents involved in the information asymmetry; the _Role arguments are the roles that the two agents take in the asymmetry. For the sake of convenience, we use poor to represent 0 and rich to represent 1.
inputVar (inputVar′) is the set of input variables, and outputVar (outputVar′) is the set of output variables, of the respective agent; the last argument is the context constraints of the information asymmetry. The semantics of the AsymInfo operator is illustrated with the following axiom.
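As a data-structure sketch, the components of AsymInfo listed above might be grouped as follows; this is illustrative only, and all class and field names are our own rather than the paper's notation.

```python
from dataclasses import dataclass, field

# Role codes as described in the text: poor = 0, rich = 1.
ROLE_POOR = 0   # the agent short of information
ROLE_RICH = 1   # the agent that owns information

@dataclass
class AsymInfo:
    agent_a: str                                   # first agent in the asymmetry
    agent_b: str                                   # second agent in the asymmetry
    role_a: int                                    # ROLE_POOR or ROLE_RICH
    role_b: int
    prop: str                                      # proposition the asymmetry is about
    input_vars: set = field(default_factory=set)   # inputVar
    output_vars: set = field(default_factory=set)  # outputVar
    context: dict = field(default_factory=dict)    # context constraints

# Example from the software-development scenario (proposition name invented):
asym = AsymInfo("customer", "developer", ROLE_POOR, ROLE_RICH,
                "new_requirement_is_feasible")
```

The grouping simply records who is involved, which role each plays, what proposition is at stake, and under which context constraints the asymmetry matters.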
Axiom 1.
The axiom says that at the current time, if poor (or rich) believes that information asymmetry about a proposition exists between it and the other agent, there must exist some proposition in Rules() (of each agent's own) that one agent believes to be true but does not believe the other believes to be true, or believes to be false but does not believe the other believes to be false. Table 1 lists the references for the Bel and MB operators. With this axiom, it can be assumed that rich has the ability to provide poor with information related to the asymmetry.
In the following section, the formal description of the communication process is presented in detail, and several related problems are discussed. The notations to be used are listed in Table 1.
2.2. The Communication Process from the General View
In modern control theory, the state space equation is a common tool to model and analyze the dynamic characteristics of systems. A state space equation can be expressed as
x′ = f(x, u), y = g(x, u),
where the state vector x describes the behaviour of the system, u is the input vector, and y is the output vector. These variables make up the state equation and the output equation. Consider that agents communicate with each other to deal with information asymmetry. Each agent has its own internal state composed of its mental attitudes. An agent sends some variables to another agent and requests answers; these variables indicate what information the sender needs. The target agent receives the variables and finally gives answers via a reasoning process. This situation is analogous to the state space equation in control theory.
Based on the previous analysis, the process of dealing with information asymmetry can be described with the following state space equations.
Here, the subscripts poor and rich in the equations represent the two agents involved in the asymmetry, with their different roles. The state of each agent is a set which includes its mental attitudes, such as beliefs and intentions. The input and output variables correspond to inputVar and outputVar in the AsymInfo operator. Primed states mean that the states of an agent are updated for the next round of communication; likewise, after poor gets the output variables from rich, the input variables of poor are updated. The functions in the equations correspond to the reasoning processes as follows. For the agent short of information, one function represents the establishment of the reasoning process for dealing with information asymmetry and identifying the input variables, and another represents the process of identifying the output variables from the input variables and internal state. For the agent with information, one function represents the process of updating its internal state after it receives the input variables, and another represents the establishment of the reasoning process for finding true values for the input variables. Finally, a separate function represents the process of hiding information (such a process could be folded into the previous one; we use a separate operator to emphasize it).
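One round of this exchange can be sketched in code; the function below mirrors the roles of the state-update and output functions described above, but its body is an illustrative placeholder for the actual reasoning processes, and all names and data are our own.

```python
def communication_round(poor_state, rich_state):
    """One illustrative round of the state space equations in the text."""
    # f_poor: poor establishes its reasoning process and identifies inputVar
    input_vars = set(poor_state["open_props"])
    # g_poor: poor derives the output variables it sends (inputVar included)
    out_poor = input_vars | poor_state["volunteered"]
    # f_rich: rich updates its internal state with the received variables
    rich_state["received"] = out_poor
    # g_rich: rich resolves a true value for each requested variable
    answers = {p: rich_state["beliefs"].get(p, "unknown") for p in input_vars}
    # h: rich may hide some answers by reporting them as unknown
    for p in rich_state.get("hidden", set()):
        if p in answers:
            answers[p] = "unknown"
    # poor's inputs are updated with rich's outputs for the next round
    poor_state["answers"] = answers
    return answers

# Example data (invented for illustration):
poor_state = {"open_props": {"schedule_ok"}, "volunteered": {"budget_is_tight"}}
rich_state = {"beliefs": {"schedule_ok": False}, "hidden": set()}
answers = communication_round(poor_state, rich_state)
```

Each call corresponds to one update of both agents' states: poor's request drives rich's state change, and rich's answers update poor's inputs.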
Equation (2) shows that in the communication process, the input and output variables define what information needs to be exchanged between the two agents, and they are closely related to the asymmetry between them. Definition 1 gives the formal definition of the input and output variables of both agents.
Definition 1. For the agent that is short of information, an input variable is defined as a pair (prop, unknown); for the agent that is short of information, an output variable is defined as a pair (prop, true_value); for the agent that is with information, an output variable is defined as a pair (prop, true_value), with true_value being true, false, or unknown.
The input variables of poor relate to what information poor wants from rich and are defined by poor's belief about rich's belief about a proposition prop. An input variable (prop, unknown) means that poor does not believe that rich believes prop to be true and also does not believe that rich believes prop to be false. The output variables of poor are some beliefs of its own that poor wants to tell rich in the communication. These variables may help rich get the information that poor needs, and they help rich avoid requesting them from poor again.
As for the input variables of rich, they are the elements of the union of the input and output variables sent by poor. The output variables of rich are beliefs of rich that are requested by poor, as defined by the input variables. An output variable (prop, true) of rich means that rich believes prop to be true, and (prop, false) means that rich believes prop to be false. An output variable (prop, unknown) means that rich does not believe prop to be either true or false.
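The (prop, true_value) variables of Definition 1 can be sketched as plain pairs; the helper names below are illustrative, not the paper's notation.

```python
UNKNOWN, TRUE, FALSE = "unknown", "true", "false"

def make_input_var(prop):
    # poor requests prop because it holds no belief either way about it
    return (prop, UNKNOWN)

def answer_var(var, beliefs):
    # rich substitutes its own belief about the proposition, if any;
    # `beliefs` maps proposition -> "true" / "false"
    prop, _ = var
    return (prop, beliefs.get(prop, UNKNOWN))
```

For example, `answer_var(make_input_var("feasible"), {"feasible": "true"})` fills in rich's belief, while a proposition absent from rich's beliefs comes back as unknown.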
The whole process is presented in Figure 2. First, the agent short of information (poor) finds that information asymmetry would bring a negative influence on the cooperation between itself and the agent with information (rich). Poor first identifies some input variables related to the information asymmetry. In the process of identifying input variables, a reasoning process is established (the tree in the second part of Figure 2). The reasoning process and the related mental attitudes of the agent correspond to the state and reasoning functions of poor in (2). At the same time, the output variables with initial values are also identified (with the input variables included). Then the output variables are sent to rich to begin a communicating process for dealing with the information asymmetry (Rule 2 in Figure 2).
After the output variables are received, rich begins to construct its own state space. The input and output variables of rich are constrained by the output and input variables of poor (as shown in (2)). The mental attitudes of rich are updated with the input variables first. In order to get the true_values of the variables, rich starts a reasoning process (the tree in the third part of Figure 2). After the true_values of the variables are obtained, rich puts these variables into the set of output variables and hides information as needed. Then the output variables of rich are sent to poor (CommuResponse in Figure 2), and poor analyzes whether the true values of variables have been hidden and updates its mental attitudes.
The information hiding during communication is constrained by the method based on game theory. Rich can form a set of strategies by deciding which set of variables to hide in its output. After poor gets the output variables of rich, it can form a set of strategies by judging each variable as hidden or not. Payoffs can also be defined for the strategies. If proper mechanisms for the game are designed, poor and rich can reach an equilibrium in their game on information hiding.
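As an illustration of this game-theoretic restriction, the toy sketch below enumerates the pure-strategy Nash equilibria of a hiding game. The strategy names mirror the text, but the payoff numbers are invented purely to show a mechanism in which revealing becomes the unique equilibrium; the paper itself only requires that payoffs be definable.

```python
from itertools import product

RICH_STRATEGIES = ["reveal", "hide"]
POOR_STRATEGIES = ["trust", "verify"]

# payoffs[(rich, poor)] = (rich_payoff, poor_payoff); values are illustrative.
payoffs = {
    ("reveal", "trust"):  (2, 2),
    ("reveal", "verify"): (2, 1),   # verification costs poor a little
    ("hide",   "trust"):  (1, -1),  # hiding pays off less than honesty here
    ("hide",   "verify"): (-2, 1),  # detected hiding is penalized
}

def pure_nash_equilibria():
    """Brute-force search: keep profiles with no profitable unilateral deviation."""
    eqs = []
    for r, p in product(RICH_STRATEGIES, POOR_STRATEGIES):
        ur, up = payoffs[(r, p)]
        rich_ok = all(payoffs[(r2, p)][0] <= ur for r2 in RICH_STRATEGIES)
        poor_ok = all(payoffs[(r, p2)][1] <= up for p2 in POOR_STRATEGIES)
        if rich_ok and poor_ok:
            eqs.append((r, p))
    return eqs
```

With the hiding penalty chosen large enough, ("reveal", "trust") is the unique pure equilibrium, which is the kind of mechanism-design outcome the text alludes to.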
2.3. The Communication Process for Dealing with Information Asymmetry in Detail
2.3.1. Start of the Process
To facilitate the introduction in the following sections, the formal definitions of the modal operators Attempt, Inform, and Request are listed as follows. The semantics of Inform and Request are given by choosing appropriate formulas to substitute into the definition of Attempt [3, 28]. Related operators and predicates are listed in Table 1. Here we use "=" to denote "defined as."
Definition 2 (Attempt, Inform, and Request).
In the definition of Attempt, one formula represents some ultimate goal that may or may not be achieved by the attempt, and another represents what it takes to make an honest effort. The definition of Inform says that at the current time, the informing agent wants the other agent to believe the proposition to be true via some event before a given time. The definition of Request says that at the current time, the requesting agent wants the other agent to execute an action via some event before a given time.
With these definitions, the process of dealing with information asymmetry can be discussed in detail. The first question is how to start and who will initiate the process of communication. According to Axiom 1 in Section 2.1, the two agents believe that information asymmetry about a proposition exists between them. If both of them have reached an agreement on the true value of the proposition, it may be unnecessary to deal with the asymmetry; they just need to act as they both agree. But when some conflicts about the proposition appear, as Axiom 1 suggests that an inconsistency between both agents' beliefs exists, both agents should consider finding out the relation between the conflicts and the information asymmetry, and the process of dealing with the asymmetry should be taken into consideration.
Another question is who will initiate the process. It seems that both the agent short of information and the agent with information may be aware of conflicts caused by the asymmetry, and both of them could start the dealing process. But the agent who initiates the process should identify the input variables, which constrain the choices of output variables of the other agent and the reasoning process of both agents. This paper assumes that the agent short of information (poor) should initiate the process, as it can find the propositions whose true values it is not aware of in its reasoning process. These propositions are candidate input variables. When information asymmetry exists between poor and rich, we define a rule to ensure that the agent with information (rich) will inform poor about a contradiction of beliefs or intentions on a proposition between them.
Rule 1.
Suppose that at some time information asymmetry about a proposition exists between poor and rich. Rule 1 covers two situations. First, if rich believes that poor intends that rich will believe the proposition to be true at a later time, but rich believes that it will not believe the proposition to be true at that time, rich should intend to inform poor that it will not believe the proposition to be true at that time. Second, if rich believes poor intends that rich will intend to do some action at some time under a context, but rich believes that it will not intend to do that action under that context at any such time, rich should intend to inform poor of such a belief. The Inform should be finished before a deadline, a certain time afterwards, which can be defined by rich according to the requirements of the concrete scenario.
In Rule 1 we use rich's beliefs about poor's intentions, because such beliefs can be obtained when poor informs rich about its intentions, while it is hard for rich to obtain poor's intentions directly.
As described in part 1 of Figure 2, if rich uses Rule 1 to inform poor about some conflicts, poor can become aware of the conflicts between itself and rich. Here Assumption 3 is defined to show how poor forms the belief of a conflict between it and rich.
Assumption 3. Poor believes that there exist conflicts between its intention that rich should perform some action or intend some proposition to hold and rich's unwillingness to perform that action or intend that proposition to hold.
Here prop1 stands for poor intending the proposition "rich believes the proposition to hold at a given time" to hold. prop2 stands for poor believing that, at some earlier time, rich does not believe that the proposition holds at that given time. prop3 stands for poor intending that "rich intends to do the action at a given time" holds. prop4 stands for poor believing that, at some earlier time, rich will not intend to do the action at any time before the deadline. The meta-predicate represents situations in which actions or propositions conflict with each other. The function constr() denotes the constraint components of the context.
Proof. As poor's belief about rich's intention is consistent with poor's intention, in Rule 1, if either condition of the rule holds at the current time, poor also has the corresponding beliefs, and at some later time (decided by rich as necessary) the informing conditions hold.
With Rule 1, rich forms the intention to Inform. If the performance of the Inform is successful, by Definition 2 there exists some time at which poor and rich reach a mutual belief about the conflicting proposition or intention. With Assumption 3, poor gets a conflict between prop1 and prop2, or between prop3 and prop4.
After poor becomes aware of the conflicts, it should consider initiating a process of dealing with the information asymmetry in the following situations, as defined in Rule 2.
Rule 2.
Here, Int.Tx stands for Int.Th or Int.To. Rule 2 says that poor believes that it intends rich to believe some proposition to be true or to intend to do some action, and at the same time believes that rich believes some other proposition to be true or intends some other action. However, in poor's opinion these conflict with each other, and information asymmetry exists between poor and rich. Then poor should form a potential intention to execute CommuAct at a later time under a context that includes the asymmetry and poor's belief about the conflict.
Here we use a potential intention because poor should reconcile the intention on CommuAct with other intentions that it has already adopted. We define CommuAct as follows.
Definition 5 (CommuAct).
Executing CommuAct means that poor first executes the action ConstructSpaceP. If ConstructSpaceP is done successfully, poor requests rich to execute CommuResponse and to make a response before a deadline. ConstructSpaceP is responsible for establishing the reasoning process for dealing with the information asymmetry. This definition will be discussed in detail later.
2.3.2. Dealing Process
In Definition 5, CommuAct is defined as an act in which agent poor first executes ConstructSpaceP, which constructs the state space according to the proposition and identifies the input and output variables. If ConstructSpaceP is executed successfully, poor requests rich to execute CommuResponse, which should be finished before the deadline. inputVar is the set of input variables of poor, and outputVar is the set of output variables of poor. outputVar is sent to rich with the Request action. inputVar′ and outputVar′ are the input and output variables of rich. The definition says that when poor sends outputVar to rich with the Request action, inputVar′ should include outputVar; when rich sends outputVar′ to poor with the CommuResponse action, outputVar′ should include the input variables. Request and CommuResponse can be implemented with an Agent Communication Language.
First, the input and output variables need further discussion. As discussed earlier, rich may hide information when it uses CommuResponse to send outputVar′ to poor. When rich has obtained the true value of a proposition, true or false, information hiding means reporting its true_value as unknown instead. Here we do not consider cheating between the two agents, where cheating means reporting the opposite of the believed truth value. So we define an assumption about information hiding as follows.
Assumption 6.
The assumption says that after an agent receives a set of output variables, for each variable var in that set, if the true_value of var is true (or false) and the receiver believes that no conflict will appear between the proposition of var and the other propositions that it believes to be true or intends to hold, the receiver will believe that the sender believes the proposition to be true (or false).
According to Definition 1, for each output variable (prop, true_value) in outputVar sent by poor to rich, (prop, true) stands for poor believing prop and (prop, false) for poor believing the negation of prop. For each output variable (prop, true_value) in outputVar′ sent by rich to poor, (prop, true) stands for rich believing prop and (prop, false) for rich believing the negation of prop. Taking the first part of Assumption 6 as an example, it says that after rich receives the output variables from poor, for each output variable (prop, true), if rich believes that its beliefs have no conflict with prop, rich chooses to believe that poor believes prop, under the context constraints of poor and rich, respectively. As for the situation where the true_value of var is unknown, it is involved with information hiding, which will be discussed later.
Next the process of CommuAct is presented in detail. According to Rule 2, a process of dealing with information asymmetry is initiated because conflicts occur between poor and rich. Such conflicts happen because, in the process of deducing the proposition, the beliefs and intentions of both agents conflict. Consider that rich has more information related to the proposition. Before poor gets the information that it needs, poor should establish reasoning trees about the proposition with its own mental attitudes and rules and find the beliefs and intentions in the trees that poor considers inconsistent with rich. These beliefs and intentions are candidate input variables, and the reasoning trees are the state spaces of poor for the process of dealing with the information asymmetry. They also correspond to the state of poor in (2) (in Section 2.2). Poor can also choose some beliefs and intentions from the reasoning trees as output variables; in poor's opinion, these variables can help rich get the true values of the input variables.
We assume that the rules of poor and rich are written as Horn clauses; that is, each rule has a body of propositions implying a head proposition. The reasoning tree can then be presented as in Figure 3. In a reasoning tree, each node is a proposition. Suppose that information asymmetry related to proposition prop exists between poor and rich, and prop appears in a belief of poor. For a rule with prop as its head, prop is the root of the tree and each body proposition is a child node of prop; for a rule with prop in its body, the head of that rule is the root and each body proposition is a child node. Every node in the tree, except the leaf nodes, has a rule whose head it is, and each proposition in the body of that rule is a child node of it. prop may have many reasoning trees at one time.
In the reasoning process of the previous tree, for the prop in the information asymmetry, some propositions in the tree of prop may be inconsistent with the beliefs or intentions of rich, and the inconsistency of these propositions may hinder poor and rich from getting a consistent result for prop. So, in poor's opinion, rich's beliefs about these propositions are what poor needs. These propositions are candidate input variables.
Assume that a conflict about a proposition prop appears between poor and rich: poor intends that rich will believe prop at some time, while rich believes that it will not. If poor has rules with prop as head, such rules can be used to establish the reasoning tree for the process of dealing with the information asymmetry. For each proposition in the body of such a rule, poor can also find rules with that proposition as head and add them into the reasoning tree. Repeating such a recursive process until no more rules can be added, the state space of the process of dealing with the asymmetry is established. There may also be rules whose bodies contain propositions of the tree; although such rules do not appear in the reasoning tree, their propositions can also be considered as input variables. Poor can also choose some propositions, or even rules, in the reasoning tree as output variables. Such a process can be implemented with a backward chaining algorithm, and this paper takes ConstructSpace as a basic action here.
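The recursive construction just described can be sketched with a simple backward-chaining pass over Horn rules. This is an illustrative sketch, assuming an acyclic rule set; the function and variable names are ours, and rules map a head proposition to a list of alternative bodies (head :- b1, ..., bn).

```python
def construct_space(prop, rules, beliefs, input_vars=None):
    """Expand the reasoning tree for prop; collect as candidate input
    variables the leaf propositions whose truth poor cannot establish."""
    if input_vars is None:
        input_vars = set()
    if prop in beliefs:            # poor already believes prop's truth value
        return input_vars
    bodies = rules.get(prop)
    if not bodies:                 # leaf with no rule and no belief: ask rich
        input_vars.add((prop, "unknown"))
        return input_vars
    for body in bodies:            # expand each rule head -> child nodes
        for child in body:
            construct_space(child, rules, beliefs, input_vars)
    return input_vars

# Example data (invented): feasibility depends on schedule and budget,
# and poor only has a belief about the budget.
rules = {"feasible": [["schedule_ok", "budget_ok"]]}
beliefs = {"budget_ok": True}
```

Here `construct_space("feasible", rules, beliefs)` yields the single input variable ("schedule_ok", "unknown"), exactly the proposition poor must request from rich.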
In part 2 of Figure 2, if the action CommuAct is successful, rich will be aware of the input variables of poor via the Request in CommuAct.
Assumption 7. Among the propositions in the previous reasoning tree, the propositions that fulfill the following conditions are taken as input variables and should be put into the inputVar of poor.
The assumption says that for a proposition in Rules(), if poor believes that it does not believe rich believes the proposition to be true and also does not believe that rich believes it to be false, then poor takes the variable var = (proposition, unknown) as an input variable.
With ConstructSpaceP, poor has its input and output variables in inputVar and outputVar, respectively, with the state space established. Then poor requests rich to execute the act CommuResponse and expects a reply for each input variable.
Axiom 2. Rich gets the output variables of poor when rich believes that poor intends rich to execute CommuResponse.
Axiom 2 says that when rich believes that poor intends that, at a certain time, rich intends to execute CommuResponse, rich gets the input variables in inputVar and puts them into inputVar′. According to Rule 2, poor requests rich to execute CommuResponse and sends the input variables with the request. If poor's request is received by rich successfully, rich is aware of poor's intention of expecting rich to intend to execute CommuResponse.
Theorem 8. In Rule 2, a successful performance of CommuAct makes rich get the input variables of poor.
Proof. If the performance of CommuAct in Rule 2 is successful, then, as introduced earlier, the state space for dealing with the information asymmetry is set up. With Assumption 7, poor gets the input variables and puts them into inputVar.
As CommuAct is successfully accomplished, the Request in CommuAct is also successful. According to the definition of Request, if the request is successful, there exists a time at which the two agents hold a mutual belief about it.
In Definition 5, the requested action is actually CommuResponse, so poor and rich get the mutual belief that rich is expected to intend to execute CommuResponse. With Axiom 2, rich finally gets the input variables in inputVar and puts them into inputVar′.
The mutual belief means that after the request, poor and rich both believe that poor wants "rich intends to execute CommuResponse at some time" to hold before the deadline. The definition of CommuResponse is shown as Definition 9.
Definition 9 (CommuResponse).
Executing CommuResponse means that for each output variable var in outputVar′, rich informs poor about its belief in the proposition of var before the deadline, under the context constraint. As for variables whose true_value is unknown, after outputVar′ is sent to poor, their true_value remains unknown.
Before rich executes CommuResponse, some work still needs to be done, including constructing a reasoning process for the input variables in inputVar′ and finding the true value for each input variable. Similar to poor, an act also needs to be defined for rich (Definition 10).
Definition 10 (CommuRes).
Definition 10 says that agent rich first constructs its state space for the process of dealing with the information asymmetry with the action ConstructSpaceR; then, if ConstructSpaceR is successful, rich executes CommuResponse, which should be finished before the deadline. input and output are the sets of input and output variables of rich, respectively. The action ConstructSpaceR is similar to ConstructSpaceP and will be discussed later.
Rule 3 requires rich to form a potential intention on CommuRes after the request about CommuResponse is received from poor.
Rule 3.
The rule says that at the current time rich believes that, some time earlier, poor intended that rich would intend to execute CommuResponse, and rich also believes that information asymmetry exists between itself and poor. Then rich should have a potential intention to execute the action CommuRes at a later time, a certain time after the current one, defined by rich according to the concrete scenario.
If the Request for CommuResponse is successful, rich will hold the corresponding belief at some time. Then, if rich believes information asymmetry exists between itself and poor, rich should have a potential intention on CommuRes.
Similar to the action ConstructSpaceP of poor, rich uses the action ConstructSpaceR in Definition 10 to establish the state space for the process of dealing with the information asymmetry. ConstructSpaceR is mainly responsible for the following tasks: updating rich's mental attitudes with the input variables in input, getting a true value for each input variable, and putting these input variables into output after the true_value of each has been obtained by reasoning. It is also responsible for deciding hiding strategies for the output variables.
First, the input variables in inputVar′ from poor contain poor's beliefs about the proposition in each variable. ConstructSpaceR updates the mental attitudes of rich with such beliefs. By Assumption 6, for each input variable (that is, an output variable of poor) whose true_value is true (or false), rich believes that poor believes the proposition in the input variable to be true (or false). The process in which rich updates its own mental attitudes with the input variables involves the problem of belief revision. Much work has been done on this problem, and several algorithms have been proposed [30–33]; as belief revision is a complex topic, it will be discussed in our future work. ConstructSpaceR is also responsible for judging whether poor hides information in CommuAct. However, since poor is the agent short of information, it seems short of motivation to hide information in the process of requesting information from rich. For simplicity, this paper does not discuss the situation in which poor hides information in communication with rich.
Second, after the belief revision is finished, rich finds the true value for each input variable. The process is similar to ConstructSpaceP: for variables whose true values cannot be obtained directly from the beliefs of rich, a reasoning tree for each variable is established to see whether the true value can be derived from rich's own mental attitudes and rules. The backward chaining algorithm can also be employed here. For those variables whose values cannot be obtained by the reasoning process, the true_value is set to unknown. These input variables are then put into outputVar′, waiting to be sent back to poor.
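These ConstructSpaceR steps can be sketched as follows: a tiny backward-chaining prover resolves each requested proposition, falls back to unknown when nothing is derivable, and then applies the hiding decision. The sketch is illustrative; all names are ours, and rules map a head proposition to a list of alternative bodies.

```python
def construct_space_r(input_vars, rules, beliefs, hidden=frozenset()):
    """Resolve each (prop, unknown) request against rich's beliefs and
    rules, then mask any propositions rich has decided to hide."""
    def prove(prop, seen=frozenset()):
        if prop in beliefs:                # direct belief: true or false
            return beliefs[prop]
        if prop in seen:                   # guard against cyclic rules
            return None
        for body in rules.get(prop, []):   # prop :- b1, ..., bn
            if all(prove(b, seen | {prop}) is True for b in body):
                return True
        return None                        # not derivable: unknown

    output_vars = []
    for prop, _ in input_vars:
        value = prove(prop)
        if prop in hidden or value is None:
            output_vars.append((prop, "unknown"))
        else:
            output_vars.append((prop, "true" if value else "false"))
    return output_vars

# Example data (invented): the schedule is fine whenever the team is free.
rules = {"schedule_ok": [["team_free"]]}
beliefs = {"team_free": True}
```

Calling `construct_space_r([("schedule_ok", "unknown")], rules, beliefs)` derives the answer by chaining, while passing `hidden={"schedule_ok"}` makes rich report the same variable as unknown, which is the hiding behaviour the game-theoretic model is meant to restrict.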
In parts 3 and 4 of Figure 2, if the action CommuRes is successful, poor will get the responses to its input variables from rich.
Axiom 3. When believes that intends