Abstract

Cloud computing delivers software to users as a service. However, its openness, virtualization, and service outsourcing make it difficult to ensure the security of private information, so how to protect user privacy information has become a research focus. In this paper, we first model service privacy policies and user privacy preferences with description logic. Second, we use the Pellet reasoner to verify consistency and satisfiability, so as to detect privacy conflicts between services and users. Third, we present an algorithm for detecting privacy conflicts in the process of cloud service composition and demonstrate the correctness and feasibility of the method through a case study and experimental analysis. Our method can reduce the risk of sensitive user privacy information being illegally used and propagated by outsourcing services. At the same time, it avoids the exceptions that privacy conflicts would otherwise cause during service composition and improves the trust degree of cloud service providers.

1. Introduction

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction [1]. With its characteristics of service outsourcing, virtualization, distribution, and multitenancy, cloud computing has become a new computing paradigm and research focus. These characteristics enhance service quality and reduce the waste of computing resources; for example, service outsourcing enhances service capability and specialization through service composition [2]. Because privacy information is transparent to the outsourcing service provider, users worry that it will be hard to prevent their privacy data from being illegally propagated and used. For example, Google was sued by many users in the United States over the new unified privacy policy it implemented on March 1st, 2012. In Europe, the implementation of this new privacy policy was investigated by the European Union and postponed. According to an analysis by the Electronic Privacy Information Center in the United States, Google's new privacy policies do not consider how privacy data is used in its products or to whom privacy data is propagated according to user privacy requirements, and these policies may conflict with local laws. Therefore, privacy protection in cloud computing has become a research focus in this evolving computing paradigm.

Privacy was originally proposed as the human right to be let alone [3]. In the domain of information systems and software engineering, privacy protection means the capability of preventing individual information from being collected, disclosed, and stored by others [4]. The Platform for Privacy Preferences (P3P) [5], developed by the World Wide Web Consortium (W3C) in 2002, provides a standard, machine-understandable privacy policy that is matched against user privacy preferences. According to the matching results, the user can select a service that meets his or her privacy preference. However, the privacy requirements described in P3P lack semantic information, and P3P applies only to web sites and does not support privacy protection in service composition. Therefore P3P cannot be applied in cloud computing, since all entities in cloud computing are services and provide functionality through service composition. In 2005 the Extensible Access Control Markup Language (XACML) [6] was proposed by the Organization for the Advancement of Structured Information Standards (OASIS). XACML 2.0 [7] extends its support for privacy policy through a profile for privacy policies. However, different users in cloud computing have different privacy requirements, requiring different definitions of sensitive privacy information. XACML privacy policies apply only to the service provider without considering user privacy requirements and can hardly guarantee that a composite service satisfies them. Pearson [8] defined privacy protection in cloud computing as the user's capability to control personal sensitive information (PSI) so that it is not collected, used, disclosed, or stored by the cloud service provider. Pearson et al. [9, 10] proposed the concept of accountability, which can support users in deciding and tracking how their data is used by cloud service providers. They provided theoretical guidance but did not put forward a specific solution for privacy protection in cloud computing. Roy et al. [11] and Bowers et al. [12] applied different privacy protection policies to data at different execution stages in cloud computing. However, in their approaches the privacy protection policy is integrated into the services and the executor of privacy protection is the service provider, so it is hard to hold the service provider to account when user privacy information is illegally disclosed.

In cloud computing all entities are services. To satisfy and execute user privacy requirements properly, the user privacy protection service must be provided by a third party. Therefore, we propose a method for building a privacy conflict detection service in cloud computing. Suppose service documents in cloud computing are described with OWL-S. In this paper, we first obtain the input and precondition of a service from its service description document, model them with the TBox of description logic, and model the user privacy preference with the ABox of description logic. In this way, we obtain a knowledge base. Then we reason over the knowledge base with description logic, namely, using the Tableau algorithm to verify its consistency and satisfiability, so as to detect conflicts between the input and precondition of services and the user privacy preference policy. In this way we can guarantee the user's privacy rights and supervise how privacy information propagates among outsourcing services. Finally, we present a semantics-aware privacy conflict detection algorithm for cloud computing and demonstrate the correctness and feasibility of the method through a case study and experimental analysis.

2. Related Works

We classify the related works on privacy protection into two parts: computing-process-oriented privacy protection and data-oriented privacy protection. The first part is divided into five smaller topics: modeling of privacy requirements, verification of privacy requirements, matching of privacy policies, negotiation of privacy policies, and disclosure risk. The second part is divided into three smaller topics: obfuscation, encryption, and anonymization of privacy data. We organize the related works into tables and compare them in terms of their contributions, the computing paradigm they apply to, whether they support service composition, and whether they support semantics. Our work is highlighted in the tables, as shown in Table 1.

Since our work focuses on privacy policy matching, we mainly discuss related works on this theme; the other related works are organized into the tables without further discussion. Barth et al. [17] defined user and service provider privacy policies, respectively, on the basis of an analysis of current privacy rules and proposed an automatic privacy policy matching method, which can check the type of privacy data, the purpose of its disclosure, its collector, and its retention period. Wei et al. [18] studied privacy data protection policies in pervasive computing applications, built a privacy model and privacy policy axioms using many-sorted logic and description logic, and proposed a reasoning method for privacy policies that can check inconsistencies among them.

3. Motivation

To explicitly clarify our research issue, we present an application scenario as follows.

Suppose Tom wants to buy a commodity from a seller through a service in cloud computing, and the service requires Tom to input his sensitive privacy information, such as real name, bank account, mobile phone number, and detailed address. Without negotiating a privacy agreement with the service, Tom may worry about two scenarios.
(1) His privacy information may be illegally used or propagated by the service or the seller. Because there is no privacy agreement, Tom cannot sue the service or the seller to recover financial or emotional losses. If Tom does not urgently need this service, then once he has a privacy conflict with the service provider, he will stop using the service and select another one. This scenario is shown in Figure 1(a).
(2) If Tom urgently wants to obtain the service, he will provide his sensitive privacy information to the service. His privacy information may then be disclosed, causing financial or emotional losses. This scenario is shown in Figure 1(b).

In this paper, our motivation is to build a service that automatically detects privacy conflicts for both users and service providers in cloud computing. Through this service, services satisfying the user privacy requirement are discovered, so as to protect user privacy information from being illegally used and propagated.

4. Basic Theories

4.1. Description Logic Basis

Description logic is the basis of the Ontology Web Language for Services (OWL-S); it is a decidable subset of first-order logic and a formalism for representing knowledge. Description logic is also called term logic, terminological knowledge representation language, concept language, or term representation language. Description logic is composed of concepts, roles, and individuals; complex concepts and roles can be described in terms of simple ones.

In this paper, we model the privacy negotiation between service provider and user with description logic, transforming the privacy conflict issue into a decidability issue for the Tableau algorithm. Suppose A and B are atomic concepts, C and D are concept descriptions, φ and ψ are formulas, a and b represent individuals, and R and S represent atomic roles. The basic constructors include atomic negation (¬A), concept intersection (C ⊓ D), value restriction (∀R.C), and limited existential quantification (∃R.⊤). This basic description logic is called ALC. All concept descriptions in ALC can be derived through the following syntax rule:

C, D ::= A | ⊤ | ⊥ | ¬C | C ⊓ D | C ⊔ D | ∃R.C | ∀R.C. (1)

All formulas in ALC can be obtained from the following atomic formulas:

φ, ψ ::= C(a) | R(a, b). (2)

The syntax and semantics of the constructors in (1) are shown in Table 2.
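To make this syntax concrete, the following minimal Python sketch (an illustration only, not part of the original method; all class and function names are our own) encodes ALC concept descriptions and converts them to negation normal form, which the Tableau algorithm below operates on.

from dataclasses import dataclass

# Minimal encoding of ALC concept descriptions (names are illustrative).
@dataclass(frozen=True)
class Atom:                 # atomic concept A
    name: str

@dataclass(frozen=True)
class Not:                  # negation ¬C
    c: object

@dataclass(frozen=True)
class And:                  # intersection C ⊓ D
    c: object
    d: object

@dataclass(frozen=True)
class Or:                   # union C ⊔ D
    c: object
    d: object

@dataclass(frozen=True)
class Exists:               # existential restriction ∃R.C
    role: str
    c: object

@dataclass(frozen=True)
class Forall:               # value restriction ∀R.C
    role: str
    c: object

def nnf(c):
    """Convert a concept to negation normal form: ¬ applies only to atoms."""
    if isinstance(c, Not):
        i = c.c
        if isinstance(i, Atom):
            return c
        if isinstance(i, Not):
            return nnf(i.c)
        if isinstance(i, And):
            return Or(nnf(Not(i.c)), nnf(Not(i.d)))
        if isinstance(i, Or):
            return And(nnf(Not(i.c)), nnf(Not(i.d)))
        if isinstance(i, Exists):
            return Forall(i.role, nnf(Not(i.c)))
        if isinstance(i, Forall):
            return Exists(i.role, nnf(Not(i.c)))
    if isinstance(c, And):
        return And(nnf(c.c), nnf(c.d))
    if isinstance(c, Or):
        return Or(nnf(c.c), nnf(c.d))
    if isinstance(c, Exists):
        return Exists(c.role, nnf(c.c))
    if isinstance(c, Forall):
        return Forall(c.role, nnf(c.c))
    return c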

The Tableau algorithm detects satisfiability among concepts in description logic. Since reasoning problems in description logic can be reduced to concept satisfiability problems, most reasoners, such as Pellet and FaCT++, use the Tableau algorithm. Suppose the negation normal form of concept C is nnf(C), and each node x in the completion tree carries a label L(x), the set of concepts that must hold at x. The reasoning rules of the Tableau algorithm are as follows.
(1) Extension rule: suppose A is an atomic concept with the definition A ≡ C in the TBox; if A ∈ L(x), then L(x) := L(x) ∪ {C}.
(2) ⊓-rule: suppose C₁ ⊓ C₂ ∈ L(x); if {C₁, C₂} ⊈ L(x), then L(x) := L(x) ∪ {C₁, C₂}.
(3) ⊔-rule: suppose C₁ ⊔ C₂ ∈ L(x); if {C₁, C₂} ∩ L(x) = ∅, then L(x) := L(x) ∪ {C} for some C ∈ {C₁, C₂}.
(4) ∃-rule: suppose ∃R.C ∈ L(x); if x does not have an R-successor y that makes C ∈ L(y), then we add a node y and assign L(⟨x, y⟩) = {R} and L(y) = {C}.
(5) ∀-rule: suppose ∀R.C ∈ L(x); if x has an R-successor y with C ∉ L(y), then L(y) := L(y) ∪ {C}.
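These rules can be prototyped directly on the encoding above (the Atom/And/Or/Exists/Forall classes and nnf from the previous sketch are reused). The sketch below is a bare-bones satisfiability check for the acyclic case; TBox unfolding and blocking are omitted, so it illustrates the ⊓-, ⊔-, ∃-, and ∀-rules rather than providing a production reasoner.

def satisfiable(label):
    """Tableau satisfiability check for a set of NNF concepts at one node
    (acyclic case; TBox unfolding and blocking are omitted for brevity)."""
    if any(isinstance(c, Not) and c.c in label for c in label):
        return False                                   # clash: A and ¬A together
    for c in label:
        if isinstance(c, And) and not {c.c, c.d} <= label:
            return satisfiable(label | {c.c, c.d})     # ⊓-rule
        if isinstance(c, Or) and not ({c.c, c.d} & label):
            return (satisfiable(label | {c.c})         # ⊔-rule: try either branch
                    or satisfiable(label | {c.d}))
    for c in label:
        if isinstance(c, Exists):                      # ∃-rule: build an R-successor
            succ = {c.c} | {d.c for d in label         # ∀-rule: push restrictions down
                            if isinstance(d, Forall) and d.role == c.role}
            if not satisfiable(frozenset(succ)):
                return False
    return True

# A requested privacy item that the preference forbids yields a clash:
conflict = And(Atom("Community"), Not(Atom("Community")))
print(satisfiable(frozenset({nnf(conflict)})))          # -> False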

4.2. Privacy-Oriented Cloud Service Description Model

Compared with traditional web services, services in cloud computing take contextual semantic information into account, which improves their self-adaptation and self-management abilities and raises their level of intelligence. Supposing that atomic services in cloud computing are described with OWL-S, the description model of an outsourcing service in cloud computing is defined as follows.

Definition 1 (outsourcing service metamodel). The outsourcing service metamodel can be expressed as a 4-tuple, namely, Outsourcing Service Description = ⟨Ontology, Profile, Privacy, Capability⟩, in which Ontology is the basic terminology of the service description; Profile describes basic information about the service, such as service name, service provider, service version, and QoS; Privacy mainly describes privacy-related information, such as the input and precondition of the service; and Capability describes the function of the service, including its output and result. The privacy-oriented outsourcing service model is shown in Figure 2. In this paper, we mainly focus on the privacy-related information; the other information is omitted, so, for example, the detailed contents of Profile and Capability are not shown in Figure 2. We can express the outsourcing service metamodel as follows:

Outsourcing Service Metamodel ⊑ ∃hasProfile.Profile ⊓ ∃hasPrivacy.Privacy ⊓ ∃hasCapability.Capability.

Definition 2 (privacy in the outsourcing service). Privacy can be expressed as a 2-tuple, namely, Privacy = ⟨Input, Precondition⟩, which is mapped into the TBox. We can express it as follows:

Privacy ≡ hasInput(service-metamodel, input) ⊓ hasPrecondition(service-metamodel, precondition).
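As a concrete reading of Definitions 1 and 2, the following Python sketch represents the metamodel as plain data structures; the field names mirror Figure 2 and are otherwise illustrative assumptions, not an API defined in the paper.

from dataclasses import dataclass, field
from typing import Dict, List, Set

@dataclass
class Privacy:
    """Definition 2: the 2-tuple <Input, Precondition>, mapped into the TBox."""
    inputs: List[str] = field(default_factory=list)         # e.g. ["realName"]
    preconditions: List[str] = field(default_factory=list)  # e.g. ["officePhone"]

@dataclass
class OutsourcingService:
    """Definition 1: <Ontology, Profile, Privacy, Capability>."""
    ontology: str               # basic terminology of the service description
    profile: Dict[str, str]     # service name, provider, version, QoS, ...
    privacy: Privacy            # privacy-related input and precondition
    capability: Dict[str, str]  # function of the service: output and result

    def privacy_items(self) -> Set[str]:
        # Definition 8 (below): privacy items ⊆ input ∪ precondition.
        return set(self.privacy.inputs) | set(self.privacy.preconditions)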

4.3. Service Trust Degree Metric

Definition 3 (trust degree (T)). The trust degree is the level to which a service or service provider can be trusted. We express it as T = ⟨S, C, Re⟩, in which S represents security, certifying the truth and integrity of data and the trustworthiness of QoS; C represents the capability of the service or service provider to meet the user's security requirement; and Re represents the reputation of the service or service provider as regarded by users [26]. S, C, and Re are the attributes of the trust degree.
The security evaluation mainly checks whether the service provides encryption, digital signature, and WSLA security; it is defined as follows:

S(s) = (E(s) + D(s) + W(s)) / 3, (3)

in which E(s) = 1 if service s provides encryption (and 0 otherwise), D(s) = 1 if s provides the digital signature function, and W(s) = 1 if s provides WSLA security.
The capability evaluation is defined through the frequency with which users access the service:

C(s) = (1/|U|) Σ_{u_j ∈ U} count(u_j, s, Δt), (4)

in which U is the set of users u_j who access service s during the period Δt = t_n − t_0 and count(u_j, s, Δt) is the number of times service s is accessed by user u_j in Δt.
The reputation evaluation is computed from user feedback and defined as follows:

Re(s) = (1/|U|) Σ_{u_j ∈ U} Σ_{t_0 ≤ t_i ≤ t_n} e(u_j, s, t_i) · f(t_i), (5)

in which U is the collection of users who evaluate service s in the period t_0 to t_n, e(u_j, s, t_i) is the evaluation information from user u_j at time t_i, and f(t) = e^{−λ(t_now − t)} is the time attenuation function, where λ is the attenuation factor and t_now is the current time.
From formulas (3), (4), and (5), we obtain the formula for calculating the service trust degree:

T(s) = Σ_i w_i · v_i, with Σ_i w_i = 1, (6)

in which w_i is the weight of each trust degree attribute of the service and v_i ∈ {S(s), C(s), Re(s)} is the value of that attribute. Let θ be the threshold of the trust degree the user expects of a service. If T(s) ≥ θ, the user accepts all privacy attributes of the service and no further privacy conflict detection by the system is needed; otherwise the system has to detect privacy conflicts to satisfy the user privacy preference.
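The following Python sketch computes formulas (3)-(6). The equal weighting in (3), the averaging in (4) and (5), the exponential decay form, and the default weights are our reconstruction assumptions; the paper fixes only the structure of the formulas.

import math
import time

def security(has_encryption: bool, has_signature: bool, has_wsla: bool) -> float:
    # Formula (3): indicator-based security score (equal weighting assumed).
    return (has_encryption + has_signature + has_wsla) / 3.0

def capability(access_counts: dict) -> float:
    # Formula (4): mean access count per user over the window Δt.
    return sum(access_counts.values()) / max(len(access_counts), 1)

def reputation(feedback: list, lam: float = 0.1, now: float = None) -> float:
    # Formula (5): time-decayed mean of feedback pairs (t_i, e_i),
    # with attenuation function f(t) = exp(-λ(t_now - t)).
    now = time.time() if now is None else now
    decayed = [e * math.exp(-lam * (now - t)) for (t, e) in feedback]
    return sum(decayed) / max(len(decayed), 1)

def trust_degree(s: float, c: float, re: float,
                 weights=(0.4, 0.3, 0.3)) -> float:
    # Formula (6): weighted sum over the attributes S, C, Re with Σw_i = 1.
    return weights[0] * s + weights[1] * c + weights[2] * re

# A service is accepted without conflict detection when trust_degree(...) >= theta.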

5. Privacy Conflict Detection

Definition 4 (sensitive degree). Sensitive privacy items are privacy information items whose level is set according to the user's habits, the scenario, and the trust degree of the outsourcing service. The sensitive degree is the value assigned to a sensitive item. User privacy information is therefore classified into sensitive and nonsensitive privacy information on the basis of its sensitive degree.

Definition 5 (user privacy preference). A constraint is expressed by the user based on the sensitive degree of his or her privacy information; each constraint is an assertion that should be satisfied by the outsourcing service and is represented by c_i. The user privacy preference PP is the collection of these assertions and is mapped into the ABox. Namely,

PP = {c_1, c_2, …, c_n} ⊆ ABox.

Example 6. Customer Tom sends a request to an outsourcing service s whose trust degree, or whose provider's trust degree, is equal to or greater than the threshold, namely, T(s) ≥ θ. Under this condition, Tom discloses his real name and mobile phone number as the service input or precondition. The constraint can be obtained with the privacy preference editor and expressed as follows: c₁ = holdRealName(s, Changbo Ke) ⊓ holdMobilePhone(s, +86-123456789).

Example 7. Customer Tom sends a request to an outsourcing service s whose trust degree, or whose provider's trust degree, is less than the threshold, namely, T(s) < θ. Under this condition, Tom uses a nickname and office phone as the service input or precondition and is unwilling to disclose his community information and mobile phone number. The constraint can be expressed as follows: c₂ = holdAddressWithoutCommunity(s, (YUDAO STREET, NANJING CITY, JIANGSU PROVINCE, CHINA)) ⊓ holdOfficePhone(s, +86-0258686866).

Definition 8 (privacy items). The outsourcing service requests the user to disclose a minimal collection of privacy data, namely, privacyItems = {p_1, p_2, …, p_m}. From the perspective of set theory, the privacy items are a subset of the outsourcing service's input and precondition, namely, privacyItems ⊆ I ∪ P, in which privacyItems is the privacy data collection, p_i is a privacy data item requested to be disclosed, and I and P, respectively, represent the input and precondition of the outsourcing service.

Definition 9 (matching between user privacy preference and privacy items). There are two kinds of matching results, detailed as follows.
(1) All services in the outsourcing service collection satisfy the user privacy preference.
With PIS = {privacyItems_1, …, privacyItems_n} as the privacy item collections corresponding to the outsourcing services to be composed, σ is a programming that satisfies the ABox for PP, namely, satisfying the following formula:

∀s_i ∈ S: match(privacyItems_i, PP) = true, (7)

in which s_i represents one outsourcing service in the service collection S to be composed and match(privacyItems_i, PP) represents the matching relationship between the privacy items and the privacy preference constraints; formula (7) states that all services satisfy the user privacy preference, and the corresponding formula is S ⊨ PP.
(2) Some but not all services in the outsourcing service collection satisfy the user privacy preference.
With PIS as the privacy item collections corresponding to the outsourcing services to be composed, σ is a programming that partly satisfies the ABox for PP, namely, satisfying the following formula:

∃s_i ∈ S: match(privacyItems_i, PP) = false, (8)

which represents that some but not all services satisfy the user privacy preference; the corresponding formula is S ⊭ PP.
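A minimal sketch of Definition 9, with set inclusion standing in for the description logic check that the reasoner performs (the function names and the example preference are our own):

def matches(privacy_items: set, allowed: set) -> bool:
    """One service matches (Definition 9) when every privacy item it requests
    is covered by the user privacy preference; set inclusion stands in for
    the description logic check performed by the reasoner."""
    return privacy_items <= allowed

def classify_composition(item_collections: list, allowed: set) -> str:
    """Distinguish the two matching results of Definition 9."""
    results = [matches(items, allowed) for items in item_collections]
    if all(results):
        return "all services satisfy PP"        # formula (7): S ⊨ PP
    if any(results):
        return "some but not all satisfy PP"    # formula (8): rebinding needed
    return "no service satisfies PP"

# Example: the preference allows nickname/office phone but not the real name.
allowed = {"nickName", "officePhone", "addressWithoutCommunity"}
print(classify_composition([{"nickName"}, {"realName", "mobile"}], allowed))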

5.1. Privacy Conflict Detection Algorithm

Suppose the atomic service collection of the service provider is S = {s_1, s_2, …, s_n}, its corresponding privacy item collection is privacyItems = {P_1, P_2, …, P_n}, and PP is the user privacy preference assertion. The process of privacy conflict detection is as follows.

In the process of service composition, the service input and precondition are first obtained from the OWL-S service description document, from which the privacy items of the service can also be derived. Privacy conflict detection then proceeds against the user privacy preference assertion PP, or an extension of PP, until a service collection that satisfies the assertion is found. If no service collection satisfies PP, service composition is stopped.

The first and second lines of Algorithm 1 are the input and output, respectively. The third to fifth lines initialize, respectively, the queue of services awaiting privacy detection, the queue of privacy item collections, and the queue of services that meet the user privacy preference after detection. In the sixth to tenth lines, the services to be detected are put into the queue and the trust degree value and privacy item collection of each atomic service are obtained in turn. In the eleventh to twenty-first lines, each privacy item collection is bound to its trust degree value and put into the privacy item queue; the head of the queue is then taken and checked for privacy conflicts with the Tableau algorithm. If there is no conflict, the service enters the queue of services satisfying the user privacy preference policy; otherwise, a new service is rebound.

(1) Input: the description document of the service (OWL-S)
(2) Output: the service composition satisfying the privacy preference policy
(3) Init Queue(S);  // services awaiting privacy conflict detection
(4) Init Queue(P);  // privacy item collections
(5) Init Queue(R);  // services satisfying the user privacy preference
(6) EnQueue(Queue(S), s_i);
(7) while (Queue(S) != φ) do
(8)   GetHead(Queue(S), s_i);
(9)   T(s_i) ← TrustDegree(s_i);  // calculating the trust degree of service s_i
(10)  P_i ← Xpath(OWL-S(s_i));  // obtaining the privacy items of service s_i
(11)  EnQueue(Queue(P), ⟨P_i, T(s_i)⟩);
(12)  while (Queue(P) != φ) do
(13)    GetHead(Queue(P), ⟨P_i, T(s_i)⟩);
(14)    r ← Detect(P_i, PP);  // detecting privacy conflict between privacy preference and privacy items
(15)    if (r = true) do
(16)      EnQueue(Queue(R), s_i);
(17)    else
(18)      rebind service;  // choosing a new service
(19)    end if
(20)  end while
(21) end while
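For readability, the following Python sketch mirrors Algorithm 1 under stated assumptions; it is an illustration, not the authors' implementation. Each service object is assumed to expose name, trust, and privacy_items() (as in the data-model sketch of Section 4.2), matches() is the helper sketched after Definition 9, the privacy item queue is folded into the main loop, and rebinding simply draws a replacement from a candidate table.

from collections import deque

def detect_privacy_conflicts(services, allowed, theta, alternatives=None):
    """Sketch of Algorithm 1: filter a composition against the user
    privacy preference, rebinding services that conflict with it."""
    alternatives = alternatives or {}
    queue_s = deque(services)      # Queue(S): services awaiting detection
    queue_r = []                   # Queue(R): services satisfying the preference
    while queue_s:                 # lines (7)-(21)
        s = queue_s.popleft()      # line (8): GetHead
        t = s.trust                # line (9): trust degree of the service
        items = s.privacy_items()  # line (10): privacy items from OWL-S via XPath
        if t >= theta or matches(items, allowed):      # line (14): conflict check
            queue_r.append(s)      # line (16): no conflict
        elif alternatives.get(s.name):
            queue_s.append(alternatives[s.name].pop(0))  # line (18): rebind service
        else:
            return None            # no collection satisfies PP: stop composition
    return queue_r                 # the composition satisfying the preference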

5.2. Privacy Conflict Detection Framework

The privacy conflict detection framework has two layers.

Privacy Conflict Predetection Layer. The part with the slash background in Figure 3 represents the privacy conflict predetection layer. This layer implements three functions, as follows.
(1) The user privacy requirement is translated into privacy preference assertions by the user privacy preference editor.
(2) User comment information and the QoS in the service description document are evaluated by the trust degree calculator, so as to obtain the trust degree value of each service.
(3) The input and precondition in the service description document are captured with XPath, and the input and precondition are refined into privacy items.

Finally, the privacy preference assertions, trust degrees, and privacy items are saved into the privacy conflict detection knowledge base. A sketch of the XPath capture in function (3) follows.
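The following Python sketch illustrates function (3) with the standard library's ElementTree. The namespace and the element and attribute names (process:Input, process:hasPrecondition, rdf:ID) are assumptions about the OWL-S layout; real service descriptions may differ.

import xml.etree.ElementTree as ET

NS = {
    "process": "http://www.daml.org/services/owl-s/1.1/Process.owl#",
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
}

def extract_privacy_items(owls_path: str) -> set:
    """Predetection function (3): capture input and precondition identifiers
    from an OWL-S document with XPath-style queries."""
    tree = ET.parse(owls_path)
    ids = []
    for path in (".//process:Input", ".//process:hasPrecondition"):
        for elem in tree.findall(path, NS):
            ids.append(elem.get("{%s}ID" % NS["rdf"]))
    return {i for i in ids if i}   # refined into privacy items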

Privacy Conflict Detection Layer. The part with the grid background in Figure 3 represents the privacy conflict detection layer.

The privacy conflict detection layer contains the knowledge base and the privacy conflict reasoner, in which the knowledge base is made up of the privacy preference assertions, trust degrees, and privacy items. In this layer, privacy conflict detection over the knowledge base is carried out by the privacy conflict reasoner, and the detection result is returned to the user.

The framework of privacy conflict detection is shown in Figure 3.

6. Case Study and Experiment Analysis

6.1. Case Study

We demonstrate the feasibility and effectiveness of our method by taking online purchasing as an example. First we make the following assumptions.
(i) The fewer privacy items a service requires, the lower the probability that user privacy information will be disclosed.
(ii) The fewer atomic services in a service composition, the smaller the scope over which user privacy information propagates and the lower the risk of disclosure.

Therefore, in this example we set the payment term to cash on delivery to decrease the possibility of propagation or disclosure of sensitive user privacy information, such as Credit-Card-no., ID-Card-no., or Realname, among the atomic service providers.

The online purchase service involves the customer (Tom), the cloud service composer (CSC), and three associated participants: the online purchase platform (E-commerce service), the seller, and the shipper. In Figure 4, we also depict the foundation services of the E-commerce service, such as the cryptographic service, operating system service, and infrastructure service. Name, address, postcode, and phone are the customer's personal privacy data. The purchase process is as follows.

When the customer sends an order request to the seller through the CSC and the E-commerce service, the E-commerce service, seller, and shipper send privacy data requests to the customer through the CSC, and the obtained privacy data serves as their input and precondition. Once the privacy data is obtained, the seller sends the goods to the customer through the shipper; the shipper collects the payment and returns it to the seller. Considering that cloud computing is distributed and all entities in cloud computing are services, we suppose that all privacy data is encrypted by the cryptographic service before being transmitted to the OS service and the infrastructure service. Therefore, we focus only on the use and disclosure of privacy data in the outsourcing services other than the OS service and infrastructure service. In this paper, we place a privacy conflict detection service between the customer and the CSC. This service detects conflicts between the privacy data requested by each outsourcing service and the customer's privacy requirement and then sends feedback to the CSC and the customer. The detailed process is shown in Figure 4 (a case of online shopping).

Based on Figure 4, we form the TBox of privacy items:

Ting ⊑ Customer;
Privacy ≡ ∃privacyOwner.Customer;
Customer ⊑ Woman ⊔ Man;
Privacy ⊑ ID-Card-Number ⊔ Address ⊔ Name ⊔ Phone;
privacyHolder ≡ E-commerceService ⊔ Seller ⊔ Bank ⊔ cloudServiceComposer ⊔ Shipper;
Name ≡ Realname ⊔ NickName;
Realname ≡ firstName ⊓ secondName ⊓ lastName;
NickName ≡ Mr.Firstname ⊔ Ms.Firstname;
Address ≡ Community ⊓ Street ⊓ City ⊓ Province ⊓ Country;
AddressWithoutCommunity ≡ Address ⊓ ¬∃hasAddress.Community;
Phone ≡ Mobile ⊔ OfficePhone.
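For concreteness, a fragment of this TBox can be encoded programmatically, for example with the owlready2 Python library. This is a hedged sketch: the ontology IRI is a placeholder, only the Name and Phone axioms are shown, and the subclass layout is one plausible modeling of the axioms above.

from owlready2 import Thing, get_ontology

onto = get_ontology("http://example.org/privacy-conflict-detection.owl")

with onto:
    class Privacy(Thing): pass
    class Name(Privacy): pass            # Name under Privacy
    class Realname(Name): pass
    class NickName(Name): pass
    class Phone(Privacy): pass
    class Mobile(Phone): pass
    class OfficePhone(Phone): pass

    # Name ≡ Realname ⊔ NickName and Phone ≡ Mobile ⊔ OfficePhone
    Name.equivalent_to.append(Realname | NickName)
    Phone.equivalent_to.append(Mobile | OfficePhone)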

In this case, the service composition participants include the E-commerce service, the seller, and the shipper. Since they hold the same user privacy data in this business process, we discuss privacy conflict detection only for the E-commerce service. The detailed privacy conflict detection steps are as follows.

First Step. Obtain the privacy items of the E-commerce service from OWL-S (the outsourcing service description) and assign values to them: realName (Changbo Ke); nickName (Mr.Ke); Street (YUDAO STREET); City (NANJING); Province (JIANGSU); Country (CHINA); Community (MINGUGONG); officePhone (+86-0258686866); Mobile (+86-123456789); ZipCode (210016).

Second Step. Obtain the user privacy preference assertion from the privacy conflict detection knowledge base, namely, the assertion in the ABox:

PP = holdRealname(s, realName) ⊓ holdAddress(s, addressWithoutCommunity) ⊓ holdOfficePhone(s, officePhone).

Third Step. Detect privacy conflicts by means of the privacy conflict reasoner.
(1) Extend the nonatomic concept AddressWithoutCommunity with the extension rule (suppose A is an atomic concept with A ≡ C; if A ∈ L(x), then L(x) := L(x) ∪ {C}). We obtain
∃holdRealname.Name(Brobo) ⊓ (Address ⊓ ¬∃holdAddress.Community)(YUDAO STREET, NANJING CITY, JIANGSU PROVINCE, CHINA) ⊓ holdOfficePhone(Brobo, +86-0258686866).
(2) Extend the nonatomic concept Address with the extension rule again, obtaining
∃holdRealname.Name(Brobo) ⊓ (Community ⊓ Street ⊓ City ⊓ Province ⊓ Country ⊓ ¬∃holdAddress.Community)(YUDAO STREET, NANJING CITY, JIANGSU PROVINCE, CHINA) ⊓ holdOfficePhone(Brobo, +86-0258686866).
(3) Apply the ∃-rule of the Tableau algorithm (suppose ∃R.C ∈ L(x); if x does not have an R-successor y that makes C ∈ L(y), then add a node y and assign L(⟨x, y⟩) = {R} and L(y) = {C}). Simplifying the above formula, we obtain
Name(Ke Changbo) ⊓ holdRealname(Brobo, Ke Changbo) ⊓ (Community ⊓ Street ⊓ City ⊓ Province ⊓ Country ⊓ ¬∃holdAddress.Community)(YUDAO STREET, NANJING CITY, JIANGSU PROVINCE, CHINA) ⊓ holdOfficePhone(Brobo, +86-0258686866).
(4) Apply the ⊓-rule of the Tableau algorithm (suppose C₁ ⊓ C₂ ∈ L(x); if {C₁, C₂} ⊈ L(x), then L(x) := L(x) ∪ {C₁, C₂}). Simplifying the above formula, and noting that ¬∃holdAddress.Community eliminates Community, we obtain
Name(Ke Changbo) ⊓ holdRealname(Brobo, Ke Changbo) ⊓ Street ⊓ City ⊓ Province ⊓ Country ⊓ holdOfficePhone(Brobo, +86-0258686866).
(5) Apply the ∀-rule of the Tableau algorithm (suppose ∀R.C ∈ L(x) and x is not directly blocked; if x has an R-successor y with C ∉ L(y), then L(y) := L(y) ∪ {C}), yielding the label set
{Name(Ke Changbo), holdRealname(Brobo, Ke Changbo), Street, City, Province, Country, holdOfficePhone(Brobo, +86-0258686866)}.
(6) Simplifying the above, we obtain {Realname, Street, City, Province, Country, OfficePhone}; substituting the values of the privacy items, {Ke Changbo, YUDAO, NANJING, JIANGSU, CHINA, +86-0258686866}, satisfies formula (7). Therefore there are no conflicts, and the user privacy preference assertion is satisfied.

Therefore, the service composition sequence satisfies S ⊨ PP, which means that it satisfies the user privacy preference.

6.2. Experiment Analysis

We build the ontology file "privacy-conflict-detection.owl" with Protégé, a Java-based tool developed by Stanford University. The concepts and instances in the ontology are mapped into the TBox. The privacy preference assertions are defined over concepts, items, and instances and are mapped into the ABox. The TBox and ABox compose the privacy conflict detection knowledge base. We save the ontology file "privacy-conflict-detection.owl" to the test directory on the E disk of the local computer and then reason over the ontology file with the Pellet reasoner, developed by the MINDSWAP lab at the University of Maryland. The Pellet version used in this experiment is 2.3.0.

The ontology model contains 255 axioms, of which 175 are logical axioms, along with 25 individuals, 33 classes, 21 object properties, and 1 data property, as shown in Figure 5.

First, we use a command to detect the consistency of the concepts in the ontology file: pellet consistency e:\test\privacy-conflict-detection.owl. The running result is shown in the red box in Figure 5, namely, "consistent". It means that the privacy items of the service providers satisfy semantic consistency.

Second, we use a command to detect the satisfiability between the ontology concepts and the logical axioms in the ontology file, namely, whether the relationships among the ontology concepts satisfy the logical axioms. The running result, shown in the green box in Figure 5, is "found no unclassifiable concepts". This result means that the privacy concepts owned by privacyHolder in the ontology file meet the user privacy preference assertion PP; privacyHolder in the ontology file corresponds to the service provider in the privacy conflict detection knowledge base. Therefore, the result shows that there is no conflict between the user privacy preference and the service provider privacy policy.
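The same checks can also be scripted rather than run from the Pellet command line. The following sketch uses the owlready2 Python library and assumes it and Pellet are installed and that the file path matches the one used in the experiment; it is an illustration, not the setup used in the paper.

from owlready2 import get_ontology, sync_reasoner_pellet, default_world

# Load the ontology file built with Protégé (path as used in the experiment).
onto = get_ontology("file://e:/test/privacy-conflict-detection.owl").load()

with onto:
    # Runs Pellet; an OwlReadyInconsistentOntologyError on this call would
    # signal a privacy conflict in the knowledge base.
    sync_reasoner_pellet()

# Unsatisfiable classes correspond to concepts violating the logical axioms;
# an empty list matches the "found no unclassifiable concepts" result above.
print(list(default_world.inconsistent_classes()))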

7. Conclusions and Future Work

In this paper, we first obtain the input and precondition of each service from its OWL-S service description document in cloud computing, model the service privacy items and the user privacy preference in a knowledge base, verify the decidability of the knowledge base with the Tableau algorithm, and detect conflicts between the service privacy items and the user privacy preference, so as to enable the user to choose a service collection that meets his or her privacy preference. We also provide a privacy conflict detection algorithm. Through a case study and experimental analysis we demonstrate the feasibility and effectiveness of our method. Future work is to negotiate privacy items between the user and the service provider, so as to meet the privacy requirements of both.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grants 61272083 and 61262002), the Funding for Outstanding Doctoral Dissertations in NUAA (Grant BCXJ12-14), and the Fundamental Research Funds for the Central Universities.