Research Article | Open Access
Confidentiality-Preserving Publicly Verifiable Computation Schemes for Polynomial Evaluation and Matrix-Vector Multiplication
With the development of cloud services, outsourcing computation tasks to a commercial cloud server has drawn the attention of various communities, especially in the Big Data era. Public verifiability offers flexible functionality in real circumstances where the cloud service provider (CSP) may be untrusted or some malicious users may slander the CSP on purpose. However, the computational result is sometimes sensitive and is supposed to remain undisclosed in the public verification phase, whereas existing works on publicly verifiable computation (PVC) fail to achieve this requirement. In this paper, we highlight the property of result confidentiality in publicly verifiable computation and present confidentiality-preserving publicly verifiable computation (CP-PVC) schemes for multivariate polynomial evaluation and matrix-vector multiplication, respectively. The proposed schemes work efficiently under the amortized model and, compared with previous PVC schemes for these computations, achieve confidentiality of the computational results while maintaining public verifiability. The proposed schemes are proven to be secure, efficient, and result-confidential. In addition, we provide the algorithms and experimental simulations to show the performance of the proposed schemes, which indicates that our proposal is also acceptable in practice.
Outsourcing computation has become a significant service with the rapid development of cloud computing technology. It enables the service purchaser (whom we call the user), who has constrained computational power, to delegate complicated computational tasks to the service provider (which we call the cloud server) and enjoy its abundant computational resources in a pay-per-use manner. This brings huge convenience for resource-constrained devices seeking to reduce their computational overhead and has thus attracted significant interest in both industrial and academic communities. A number of large enterprises, such as Amazon, Google, and Alibaba, have launched cloud computing platforms to provide computation outsourcing services. What is more, in the Big Data era, the ability to deal with massive data has become a core competitive advantage, and outsourcing computation fits this demand well.
While the outsourcing computation paradigm enjoys numerous benefits, it also faces rigorous challenges. To begin with, since the cloud server is commercial, it may not perform the computation honestly but instead output a computationally indistinguishable result in order to save cost for greater profit. Therefore, a basic requirement of outsourcing computation is to assure the correctness of the computational result. In other words, the user should have a way to verify the correctness of the output from the cloud server with overwhelming probability. Besides the untrustworthiness of the cloud server, misbehavior may also occur on the user side. For example, a malicious user may deliberately claim that the output from the cloud server is incorrect and thereby slander the cloud service provider even if the cloud server has performed the computation honestly. This is possible because verification is done in a private manner. Therefore, it is preferable that the verification can be done publicly; that is, anyone, not only the user himself, is able to verify the output from the cloud server. With public verification, not only can the cloud server not cheat with an incorrect output, but the user also cannot groundlessly claim the output incorrect, because the output is now witnessed and verified by everyone. Secondly, since the computational result is sometimes sensitive, it needs to be kept secret from any party except the user himself. Thus another challenge of outsourcing computation is to assure confidentiality of the computational result, especially when the output from the cloud server can be verified publicly. Last but not least, the total workload of the user in an outsourcing procedure must be much less than accomplishing the computation task entirely by himself. We call this the requirement of efficiency. It is essential because otherwise the outsourcing would be meaningless.
The evaluation of multivariate polynomials is one of the most fundamental computational tasks in scientific communities. In practice, many problems can be reduced to evaluating a certain polynomial on a multivariate input, for example, evaluating an employee's performance in a company or a person's health condition. Matrix-vector multiplication is another fundamental and widely applied computational task, underlying, for example, the Discrete Fourier Transform (DFT) and the Singular Value Decomposition (SVD). In the Big Data era, as the data we need to deal with grows ever larger, it is very likely that the storage required for evaluating multivariate polynomials or matrix-vector multiplication exceeds the available memory of the user's devices, such as cell phones or portable laptops. Thus we need another way to fulfill these computational tasks securely and efficiently. Plenty of work has been done to seek secure and efficient outsourcing schemes for polynomials and matrix-vector multiplication. Fiore and Gennaro proposed schemes to securely outsource the evaluation of multivariate polynomials and matrix-vector multiplication and to verify the corresponding result in a public manner. Unfortunately, one disadvantage of their proposal is the leakage of the final result: anyone is able to verify the correctness of the output from the cloud server and then obtain the result of the target evaluation. This is a drawback in practice when the result is sensitive, for example, the year-end bonus of an employee or the health condition of a person.
1.1. Related Works
1.1.1. Verifiable Computation
Verifiable computation (VC) was first proposed by Gennaro et al. In VC, only two parties are involved: the client, which processes the input data, and the server, which evaluates the target function on the value the client sends. The output of the server can be verified by the client only. Both the input and output values of the function are private throughout the procedure. Gennaro et al. proposed a concrete VC scheme for arbitrary circuits using Yao's two-party computation scheme and Gentry's fully homomorphic encryption (FHE) scheme. After that, different VC schemes using FHE were proposed [6–8]. They made use of various techniques to achieve verifiability. However, applying FHE in practice incurs expensive overhead.
1.1.2. Publicly Verifiable Computation
Different from VC, where the verification is done privately, publicly verifiable computation (PVC) allows anyone to verify the result output by the server. PVC enables more flexible application in the untrusted cloud environment than VC, mainly in two respects. One is relieving the client of the verification workload. The other is the supervision of users: if the verification is done only privately, then once a result is claimed incorrect, it is hard to tell whether the server misbehaved or the user is intentionally slandering; if the result can be verified publicly, the user's slandering is easy to detect. PVC was first proposed by Parno et al., who constructed a PVC scheme for Boolean functions using KP-ABE schemes. After that, many PVC schemes were proposed [2, 17–19]. In 2012, Fiore et al. proposed a PVC scheme for multivariate polynomial evaluation and matrix multiplication. Taking inspiration from Benabbas et al.'s VC scheme, they used a pseudorandom function with closed form efficiency to generate the verification key efficiently, and they generalized the function to the multivariate case. Moreover, by leveraging bilinear maps, they upgraded the verification procedure from private to public. With a similar technique, Sun et al. constructed batch verifiable computation schemes with public verification for polynomials and matrices that achieve simultaneous evaluation of multiple functions in one outsourcing phase. In 2016, Elkhiyaoui et al. proposed another solution for univariate polynomial evaluation and matrix multiplication. They leverage the Euclidean division of polynomials to construct the verifiable computation and utilize bilinear maps to make the verification public. However, this idea is only suitable for the univariate polynomial scenario.
What is more, all the schemes mentioned above share the same disadvantage: anyone other than the user who verifies the result will surely obtain its concrete value. This is insecure in practice, since the result of a computation is often sensitive and the verification process is supposed to output a judgement on the correctness of the result rather than disclose the value itself. To overcome this, Alderman et al. improved Parno et al.'s scheme with a secret substitution bit and presented a PVC with a key distribution center (KDC) for Boolean functions. They also achieved other properties like revocation based on this PVC with KDC [21, 22], but the secret-substitution-bit approach cannot fit other functions with large ranges.
1.2. Our Contributions
In this paper, we present a modified PVC model that we consider more practical. It captures the confidentiality of the computational result, which we believe is an important property in practical use.
We present outsourcing schemes for securely and efficiently evaluating high degree multivariate polynomials and matrix-vector multiplication. Compared with existing outsourcing schemes for polynomials [2, 23] and matrix multiplication, our proposal simultaneously captures properties of both public verifiability and result confidentiality. This offers a more flexible application in practice.
We also provide the algorithm for our outsourcing scheme and run some simulated experiments to show the efficiency of the proposed schemes.
This paper is an extension of its corresponding conference version. In this revised version, we extend the CP-PVC scheme to the matrix-vector multiplication case. The resulting scheme is proven to achieve not only the properties of the polynomial case (i.e., security, public verifiability, and result confidentiality) but also input privacy. We run corresponding simulated experiments, and the results are also acceptable for practice.
1.3. Paper Organization
The remaining parts of the paper are organized as follows. Necessary preliminaries for the proposed schemes are provided in Section 2. The framework of the proposed CP-PVC protocol is defined in Section 3. The concrete constructions of the CP-PVC schemes for polynomial evaluation and matrix-vector multiplication are presented, analyzed, and experimentally evaluated in Sections 4 and 5, respectively. The conclusion of the paper is in Section 6.
In this section, we provide definitions of the algebraic pseudorandom function (PRF) with closed form efficiency, bilinear maps, and some related notions. We also provide the computational assumptions used in the construction of our schemes.
2.1. Algebraic PRF with Closed Form Efficiency
One of our main techniques is the PRF with closed form efficiency. A PRF is a keyed function generated from a secret seed; it owns the properties of both pseudorandomness and computational efficiency. A closed form efficient PRF consists of a pair of algorithms (KG, F) defined as follows:
(i) KG: The randomized key generation algorithm takes as input the security parameter and outputs a tuple of parameters consisting of the secret seed and the public parameters, which specify the domain and range of the function, respectively.
(ii) F: The deterministic functional computation algorithm takes as input the secret seed and an input value and computes the function value.
An algebraic PRF with closed form efficiency must satisfy the following properties:
(Algebraic) A PRF is algebraic if its range forms an abelian group. We use multiplicative notation for the group operation.
(Pseudorandom) A PRF is pseudorandom if every PPT adversary distinguishes it from a truly random function with at most negligible advantage.
(Closed form efficiency) Let Comp represent an arbitrary computation that takes as input a set of random values and a vector of arbitrary values, and assume that the fastest algorithm to compute Comp takes time T. A PRF is closed form efficient for Comp if there exists an algorithm that, given only the secret seed and the arbitrary values, computes the same result in time o(T).
Here we only show the definition of PRF with closed form efficiency. We will give a concrete algorithm of PRF with closed form efficiency for multivariate polynomials in Section 3.
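To make the (KG, F) interface concrete, here is a minimal Python sketch of an algebraic PRF in the style of Naor-Reingold; this is an illustration, not the construction used later in the paper, and the group parameters and names are assumptions. The seed is a vector of secret exponents, and the function value lives in a multiplicative group, so the range is abelian as the algebraic property requires.

```python
import secrets

def kg(p, g, n):
    """KG: secret seed K (n random exponents) and public parameters
    pp = (p, g, n), fixing domain {0,1}^n and range inside Z_p^*."""
    seed = [secrets.randbelow(p - 2) + 1 for _ in range(n)]
    return seed, (p, g, n)

def prf(seed, pp, x):
    """F_K(x) = g^(product of k_i over positions where x_i = 1) mod p.
    Deterministic given (seed, x); the range is a subgroup of Z_p^*,
    which is an abelian group (the 'algebraic' property)."""
    p, g, n = pp
    assert len(x) == n
    e = 1
    for k_i, x_i in zip(seed, x):
        if x_i:
            e = (e * k_i) % (p - 1)  # exponent arithmetic mod the group order
    return pow(g, e, p)
```

Pseudorandomness of this textbook construction rests on the DDH assumption; the PRF used in this paper relies instead on the decision linear assumption below.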
2.2. Bilinear Map
Our constructions also use bilinear maps. Bilinear pairing is a powerful tool in noninteractive authentication and has been widely applied in both encryption and signature schemes [9, 25]. To be specific, let G1, G2, and GT be finite cyclic multiplicative groups of prime order q, and let g1 and g2 be generators of G1 and G2, respectively. A map e: G1 x G2 -> GT is called a bilinear map if it satisfies the following properties:
Bilinearity: e(u^a, v^b) = e(u, v)^(ab) for all u in G1, v in G2, and a, b in Zq.
Nondegeneracy: e(g1, g2) is not the identity of GT.
Computability: there exists an efficient algorithm to compute e(u, v) for any u in G1 and v in G2.
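The bilinearity property can be illustrated with a toy Python model in which each group element is represented by its discrete logarithm, so that e(g1^a, g2^b) = e(g1, g2)^(ab) becomes multiplication of exponents modulo the group order. This is purely didactic (a real pairing would be computed over elliptic curves via a pairing library); the order q below is an illustrative assumption.

```python
q = 7919  # toy prime group order (illustrative only)

def pair(a, b):
    """Toy pairing: elements of G1, G2, GT are identified with their
    exponents mod q, so e(g1^a, g2^b) corresponds to a*b mod q in GT."""
    return (a * b) % q

# Bilinearity: e(u^c, v) == e(u, v)^c, written in exponent form.
u, v, c = 1234, 567, 89
lhs = pair((u * c) % q, v)    # e(u^c, v)
rhs = (pair(u, v) * c) % q    # e(u, v)^c
assert lhs == rhs

# Nondegeneracy: e(g1, g2) (exponent 1) is not the identity (exponent 0).
assert pair(1, 1) != 0
```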
2.3. Computational Assumptions
The computational assumption that is used for the construction of the PRF with closed form efficiency is decision linear (DL) assumption. We present the definition below.
Definition 1 (decision linear assumption). Let G be a group of prime order q. One defines the advantage of an algorithm in deciding the decision linear problem in G as its probability of distinguishing a linear tuple from a random tuple. One says that the decision linear assumption holds in G if for every polynomial-time algorithm this advantage is negligible.
Note that the decision linear assumption holds in generic bilinear groups. The corresponding proof can be found in .
Next we present the definition of the co-CDH assumption, which is the basis for the security of our proposed schemes. The co-CDH assumption was first introduced in the BLS signature scheme presented by Boneh et al., as a natural extension of the standard CDH problem to asymmetric bilinear pairings. It is defined as follows.
Definition 2 (co-CDH assumption). Let G1, G2, g1, and g2 be as in Section 2.2. Given random group elements, one defines the advantage of an algorithm in solving the co-CDH problem as its success probability, and one says that the co-CDH assumption holds if for every polynomial-time algorithm this advantage is negligible.
Note that when G1 = G2, the co-CDH problem reduces to the standard CDH problem.
3. Modelling CP-PVC
We use an amortized model to construct our CP-PVC scheme. That is, the user shall invest a larger amount of computational work in a preprocessing phase in order to gain efficiency during the computation outsourcing phase. The adversaries in a PVC protocol are of two types: the cloud server and some "curious" verifiers. The former follows the lazy-but-honest model and the latter the honest-but-curious model. This is reasonable since, in practice, a rational commercial cloud service will try to minimize the computation it needs to do to pass the verification algorithm, and passing the verification algorithm is its priority because only in this way can it get paid. Also, since the verification is public, there may be curious verifiers that run the public verification algorithm and try to discover secret information about the final result value.
A difference between the framework of PVC proposed by Parno et al. and ours is that we address the confidentiality of the computational result. In the public verification phase, a bit is output instead of the result value, and the result value is obtained in a later phase called private retrieval. To realize confidentiality, the user needs to process the target function in the preprocessing phase, obtaining a secret retrieval key that is kept private.
Let the target function belong to a class of functions. We define a confidentiality-preserving publicly verifiable computation (CP-PVC) protocol via the following five algorithms:
(i) KeyGen: The randomized key generation algorithm takes as input a security parameter and the function and outputs a secret key for the input delegation phase, an evaluation key for the cloud server to compute the outsourced task, and the public parameters. This is done by the client.
(ii) ProbGen: Given the public parameters, the secret key, and the input value, the randomized problem generation algorithm outputs a public encoding of the input, together with a public verification key for nonclient parties to verify correctness and a private retrieval key for the client to retrieve the final result. This is done by the client.
(iii) Compute: On input the evaluation key and the encoded value, the randomized computation algorithm outputs an encoded result. This is done by the worker (cloud server).
(iv) PubVer: The deterministic public verification algorithm uses the public parameters and the public verification key to check whether the returned result is correct and returns accept or reject accordingly. This is done by the nonclient verifiers.
(v) PrivRet: The deterministic private retrieval algorithm takes the returned result and the private retrieval key and computes the final output, or a special symbol indicating that the public verification algorithm rejected the worker's answer. This is done by the client.
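The division of labor among the five algorithms can be sketched in Python with a deliberately simplified instantiation: the client blinds the polynomial's coefficients with a random polynomial b, the server evaluates the blinded polynomial, and only the holder of the retrieval key b(x) can unblind the answer. The PubVer pairing check is stubbed out here, since it needs the bilinear-map machinery of Section 4; the toy prime and all function names are assumptions for illustration.

```python
import secrets

P = 2**61 - 1  # toy prime field (illustrative)

def eval_poly(coeffs, x):
    """Horner evaluation of sum(coeffs[i] * x^i) mod P."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def keygen(f):
    """KeyGen: blind f with a random polynomial b; the server only
    ever sees the blinded coefficients a = f + b."""
    b = [secrets.randbelow(P) for _ in f]
    ek = [(fi + bi) % P for fi, bi in zip(f, b)]
    return b, ek                       # (secret key, evaluation key)

def probgen(sk, x):
    """ProbGen: here x itself is the encoding; the retrieval key is b(x)."""
    return x, eval_poly(sk, x)

def compute(ek, sigma_x):
    """Compute: server returns a(x) = f(x) + b(x), which hides f(x)."""
    return eval_poly(ek, sigma_x)

def pubver(sigma_y):
    """PubVer (stub): the real scheme checks a pairing equation here."""
    return True

def privret(rk, sigma_y):
    """PrivRet: unblind, recovering f(x) = a(x) - b(x)."""
    return (sigma_y - rk) % P if pubver(sigma_y) else None

f = [5, 0, 7, 3]                       # f(x) = 5 + 7x^2 + 3x^3
sk, ek = keygen(f)
sigma_x, rk = probgen(sk, 11)
assert privret(rk, compute(ek, sigma_x)) == eval_poly(f, 11)
```

The point of the sketch is the information flow: the server works only on ek, verifiers would see only the (stubbed) public check, and the client alone holds rk.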
A verifiable computation scheme should be both correct and secure. We give the definition of correctness and security in the following.
Definition 3 (correctness). A confidentiality-preserving publicly verifiable computation protocol is correct for a class of functions if, for any function from the class, any key tuple output by KeyGen, any input chosen from the domain, any tuple output by ProbGen, and any value output by Compute, the PubVer algorithm accepts and the PrivRet algorithm outputs the correct function value.
The security of a verifiable computation requires that the worker is not able to output an incorrect value that passes the PubVer or the PrivRet algorithm. We give the formal definition via the following experiment.
Definition 4 (security). Let the scheme be a confidentiality-preserving publicly verifiable computation scheme for a class of functions, and let the adversary be a PPT algorithm. In the security experiment, the adversary is given the public parameters and the evaluation key, adaptively queries the ProbGen algorithm on inputs of its choice and receives the corresponding encodings and verification keys, and finally outputs an encoded answer for a challenge input. The adversary wins if its answer passes the PubVer algorithm yet the PrivRet algorithm returns a value different from the true function value. A confidentiality-preserving publicly verifiable computation scheme is secure for a class of functions if, for any function from the class and any PPT adversary, the probability of winning the above experiment is negligible in the security parameter.
Next we give the confidentiality definition of CP-PVC, which is not defined in existing PVC frameworks [2, 16]. In this paper we focus on the confidentiality of the final result, meaning that the adversaries cannot learn any information about the result value from the value output by the Compute algorithm. Here the adversaries refer to the cloud server and any nonclient verifier. Since the cloud server has extra knowledge of the evaluation key compared with the nonclient verifiers, we only need to define result confidentiality against the cloud server; confidentiality against the cloud server implicitly implies confidentiality against nonclient verifiers. Notice that we do not regard input privacy as a necessity in publicly verifiable computation, because in some scenarios the input data is obtained from public sources accessible to anyone. However, we still present a loose definition of input privacy for multivariate functions. Intuitively, it means that, for a function with an input set of multiple independent variables, the probability that the adversary learns the values of a given fraction of the input set is bounded. The definitions are as follows.
Definition 5 (result confidentiality). A confidentiality-preserving publicly verifiable computation protocol is result-confidential for a class of functions if, for any from , any tuple output by KeyGen, any chosen from Domain, any tuple output by ProbGen, any output by Compute, and any PPT adversary , it holds that
Definition 6 ( input privacy). A confidentiality-preserving publicly verifiable computation protocol for a class of multivariate functions achieves input privacy if, for any from , any tuple output by KeyGen, any input set chosen from Domain, any tuple output by ProbGen, any output by Compute, and any PPT adversary , it holds that
Finally, we give the definition of efficiency. Informally speaking, efficiency means that the total computational cost on the client side when engaging the CP-PVC scheme is less than that of directly computing the target function. In the amortized model, since KeyGen is done once and its cost is amortized over multiple function evaluations with different input values, this part of the computational overhead does not need to be counted.
Definition 7 (efficiency). A confidentiality-preserving publicly verifiable computation protocol for a class of multivariate functions is efficient if, for any from and any chosen from Domain, the total computational cost of algorithms ProbGen and PrivRet is less than that of directly evaluating on .
4. The CP-PVC Scheme for Polynomial Evaluation
In this section, we first review the construction of PRF in , showing that it is closed form efficient for polynomials in variables and degree at most in each variable. Then we present the corresponding algorithm for evaluating the PRF and its closed form efficiency. After that, we give the concrete construction of our CP-PVC scheme for polynomial evaluation together with the analysis and experimental simulation.
4.1. Algorithm for PRF with Closed Form Efficiency
Let G be a group generator that takes as input a security parameter and outputs the description of a group of prime order. Consider any polynomial with m variables and degree at most d in each variable; such a polynomial has (d+1)^m monomials in total, which we index by the tuples of per-variable degrees. We say that the construction of the PRF admits closed form efficiency for the following computation: evaluating the polynomial whose coefficients are the discrete logs of the PRF values. In this case there exists an algorithm that performs the computation in time linear in the number of variables and the degree bound, instead of the regular running time, which is proportional to the total number of monomials.
The proof of the above claim can be found in . Here we show the algorithm for evaluating the PRF as well as the polynomial .
Let . The construction of PRF is the following algorithm:(i) KG: Run to generate a group description . Choose random values
The algorithm outputs
The domain and range of the function are as specified by the public parameters.
(ii) F: Let x be the input of the PRF. First interpret each coordinate of x as a binary string. Then run Algorithm 1.
Finally, the value of the PRF is . Let be the input; the polynomial can be written as
The value of the polynomial then follows directly. Note that the above algorithm performs only a small number of recursive operations, so its running time is far smaller than the regular running time, which is proportional to the total number of monomials.
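The source of the speedup can be seen in a small Python sketch; this illustrates the factorization idea behind closed form efficiency, not the paper's exact algorithm. When the coefficients have product structure r[j][i_j], as pseudorandom values of this shape do, the sum over all (d+1)^m monomials factorizes into a product of m univariate sums of d+1 terms each.

```python
from itertools import product

def naive(r, x):
    """Sum over all (d+1)^m monomials of prod_j r[j][i_j] * x[j]^i_j."""
    m, d1 = len(r), len(r[0])
    total = 0
    for idx in product(range(d1), repeat=m):
        term = 1
        for j, i in enumerate(idx):
            term *= r[j][i] * x[j] ** i
        total += term
    return total

def closed_form(r, x):
    """Factorized form: prod_j sum_i r[j][i] * x[j]^i.
    Only m*(d+1) multiplications instead of (d+1)^m terms."""
    total = 1
    for r_j, x_j in zip(r, x):
        total *= sum(c * x_j ** i for i, c in enumerate(r_j))
    return total

r = [[2, 3], [5, 7], [1, 4]]   # m = 3 variables, degree d = 1
x = [2, 3, 5]
assert naive(r, x) == closed_form(r, x)
```

The two functions agree by the distributive law; only the factorized one scales to large m.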
With the use of the above closed form efficient PRF in Algorithm 2, we can realize public verifiability by letting the PRF value be part of the verification key. Our remaining goal is then to make this public verification process "blind". Intuitively, one way to keep the result confidential is to encrypt the input value, as is commonly done in verifiable computation schemes. However, such an operation requires homomorphic properties of the encryption scheme, and encryption schemes with stronger homomorphic properties are usually less efficient. Therefore, we instead blind the target polynomial itself. We use two sparse multivariate polynomials to randomize the target polynomial, where sparse means that the degree of the polynomial is 1 and only a bounded number of monomials are nonzero. Thus the value of the target polynomial is masked by the two blinding polynomials, whose evaluation overhead is linear in the number of variables.
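Evaluating such a sparse degree-1 blinding polynomial costs only one pass over its nonzero terms, which is what keeps the client-side blinding work small. A minimal sketch, where the representation (a constant plus a list of index-coefficient pairs) and the modulus are illustrative assumptions:

```python
def eval_sparse_linear(b0, terms, x, p):
    """Evaluate b(x) = b0 + sum of b_j * x[j] over the nonzero terms,
    given as (variable index, coefficient) pairs.
    Cost is O(number of nonzero terms), independent of the total
    number of variables with zero coefficients."""
    acc = b0
    for j, b_j in terms:
        acc = (acc + b_j * x[j]) % p
    return acc

# b(x) = 4 + 3*x_0 + 9*x_2 over Z_101, evaluated at x = (7, 0, 5)
assert eval_sparse_linear(4, [(0, 3), (2, 9)], [7, 0, 5], 101) == 70
```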
4.2. Construction of CP-PVC for Polynomial Case
Now, we present our concrete CP-PVC scheme for multivariate polynomials with m variables and degree at most d in each variable, in which case the polynomial has at most (d+1)^m monomials. We represent the target polynomial by its list of monomials and the corresponding coefficients, and we represent the two sparse blinding polynomials in the same way. The CP-PVC scheme works as follows:
KeyGen: For the given security parameter, the client runs a bilinear group generator to generate a bilinear tuple, runs the PRF key generation algorithm KG to generate a key with range in the bilinear group, chooses the required random values, and computes the blinded coefficient terms. The algorithm outputs the secret key, the evaluation key, and the public parameters.
ProbGen: On input the value x and owning the secret key, the client computes the public verification key using the closed form efficient algorithm together with the bilinear map, and the private retrieval key by evaluating the two sparse blinding polynomials at x. The algorithm outputs the encoding of x together with the two keys.
Compute: On receiving the evaluation key and the encoded input, the cloud server evaluates the blinded polynomial and the accompanying proof term and outputs both.
PubVer: On receiving the public parameters and the verification key, the verifier checks the verification equation over the bilinear map. The algorithm accepts if the equation holds; otherwise it rejects.
PrivRet: On receiving the server's output and owning the retrieval key, the algorithm first runs the PubVer algorithm. If PubVer accepts, the client subtracts the blinding values to recover the result and returns it.
4.3. Security Analysis
Now we analyze the correctness, security, and confidentiality of the proposed CP-PVC scheme.
Correctness is easy to prove. Let be as above; then
The above proves the correctness for evaluating , and this implies the correctness of .
We show that, for an adversarial cloud server holding the evaluation key, the probability that it can extract the value of the result is negligible; i.e., result confidentiality holds against the cloud server. This implies that confidentiality also holds against any third-party verifier, since a verifier has less information than the cloud server does. To extract the result, one way is directly from the blinded output. However, this requires knowledge of the blinding values, which are part of the retrieval key and kept secret. The other way is to discover the coefficients of the function and evaluate it on the input, since the input value is stored in plaintext. We will show that under this circumstance the adversary cannot discover the coefficients with nonnegligible probability. Consider the blinded polynomial as follows:
and then we have that
Comparing the coefficients on both sides of the above equation, we get a system.
Note that the above system does not have a unique solution in , and the coefficients of and are chosen uniformly at random from . This means that the probability to choose the correct coefficients is negligible, and thus the privacy of coefficients of is guaranteed, which makes the confidentiality of the final result hold.
The security of the scheme is based on co-CDH assumption and the corresponding proof follows from that in . We take it as an inspiration and define the following four games.
Game 0. It is the same as .
Game 1. It is similar to Game 0, with the difference that contains coefficients randomly chosen from instead of .
Game 2. It is similar to Game 1, with the difference that the ProbGen phase uses an inefficient algorithm instead of the closed form efficient algorithm for the evaluation.
Game 3. It is similar to Game 2, with the difference that each value is replaced by a random element .
We prove security via a hybrid argument over the following claims.
Corollary 4. .
Proof. This claim holds in an obvious way, since the change of algorithm for evaluating does not change the distribution of its values. Thus the probability that the adversary wins is the same in both Games 0 and 1.
Corollary 5. .
Proof. The difference between Games 2 and 1 is the coefficients of the target function. According to the confidentiality proof, these coefficients are indistinguishable since and are chosen uniformly at random, thus sharing the same distribution in the view of the adversary.
Corollary 6. , where represents the probability in the pseudorandomness definition of PRF.
Proof. The difference between Games 3 and 2 is that the output of PRF is replaced by uniformly random elements in . According to the pseudorandomness property of PRF, the probability that an adversary distinguishes the two values is no better than .
Corollary 7. The adversary's winning probability in Game 3 is bounded by the probability that an adversary solves the co-CDH problem.
Proof. To prove this claim we need to show that if there exists a PPT algorithm that wins Game 3 with nonnegligible probability, then one can build an efficient algorithm with oracle access to it that solves the co-CDH problem with nonnegligible probability. Assume that the groups in the co-CDH problem are as described above and that the reduction is given a co-CDH tuple with exponents chosen at random; the reduction works as follows.
First, the reduction needs to simulate the public parameters and the evaluation key. It computes the bilinear map, chooses random elements, and forms the simulated public key and the simulated evaluation key from them. Since the evaluation key generated in Game 3 contains a PRF value as a factor and the PRF enjoys pseudorandomness, the simulated public key and evaluation key have the same distribution as in Game 3.
Second, the reduction gives the simulated keys to the adversary and prepares to answer its queries. A difference here is that the reduction does not need to generate the retrieval key in the querying phase, because the adversary has no access to it, nor does the reduction need it to solve the co-CDH problem. For each queried value, the reduction computes the corresponding encoding and verification key. Since the verification key in Game 3 is formed via the bilinear map, the simulated verification key also has the same distribution as in Game 3, and it is handed to the adversary. Finally, for the challenge value chosen by the adversary, the above process is repeated and the adversary obtains the challenge verification key.
Finally, the adversary outputs a tuple that passes the verification equation. Due to the correctness property, this yields one equation over the bilinear map. Now consider the correct output of the PrivRet algorithm; by correctness, the honest answer satisfies the analogous equation as well.
Dividing (25) by (26) and using the bilinear property, the reduction can compute the co-CDH solution. Thus, if the adversary wins Game 3 with a certain probability, the reduction solves the co-CDH problem with the same probability. This proves the claim.
Combining the four claims together, we obtain that
4.4. Performance Analysis
In this subsection, we analyze the computational complexity of the proposed scheme and compare it with some existing works [2, 19]. Our scheme is efficient in the amortized model: the expensive KeyGen algorithm is executed once, and its computational overhead is amortized over subsequent evaluations of the polynomial function on different input values. The structure of the function cannot be changed unless KeyGen is run again for a different function. We now give a detailed analysis of the computational cost of each algorithm, with each symbol denoting the same item as in the scheme above. In the KeyGen phase, to compute the blinded terms, the client performs modular exponentiations and multiplications in the group, plus multiplications and additions over the ring to form the structure of the function, and one pairing to generate the public key. In the ProbGen phase, no work is needed to encode the input, since it is sent as-is; to generate the public verification key, the closed form efficient algorithm is executed together with one pairing; and to generate the private retrieval key, the client evaluates two sparse degree-1 polynomials. In the Compute phase, the cloud server evaluates the blinded polynomial once and performs modular exponentiations in the group to compute the proof term. In the PubVer phase, the third-party verifier performs one pairing, one modular exponentiation, and one multiplication in the target group. Finally, in the PrivRet phase, the client first performs the PubVer computation and then one subtraction and one division to recover the result.
We present a complexity comparison between our scheme and existing PVC schemes for polynomial evaluation [2, 19]. After analyzing the specific operations of each algorithm, we use asymptotic notation to compare the computational complexity. Since the arithmetic operations within each algebraic structure are of the same kind, we classify the complexity into three types: the time complexity of operations in the ring of integers modulo a prime, including modular addition and multiplication; the time complexity of operations in the exponent ring, including addition and multiplication; and the time complexity of operations in the bilinear groups, including multiplication, exponentiation, and the bilinear map. The comparison result is shown in Table 1. The table shows that, compared with the first scheme, ours incurs a small complexity increase in KeyGen, ProbGen, and PrivRet on the client side. However, since these increases do not involve the most expensive operations of the three phases, the impact on real performance is not noticeable. Compared with the second scheme, we first note that it works only for the univariate case, whereas ours handles the multivariate case. When restricted to a single variable, the complexity of the two schemes is nearly the same, with a small increase in ProbGen and PrivRet that again does not touch the most expensive part of either phase. In exchange, our scheme achieves a security property, result confidentiality, that is important for user privacy.
We also provide an experimental simulation of the CP-PVC scheme for polynomial evaluation to show its efficiency. Intuitively, efficiency means that the total computational overhead on the client side when engaging in the outsourcing protocol is less than that of executing the target function directly. Since our scheme is constructed in the amortized model, i.e., the computation in KeyGen needs to be done only once, we do not count that part in the efficiency evaluation. The client-side cost we measure therefore covers the ProbGen and PrivRet phases. We implement the corresponding client-side algorithms in MATLAB 2015 on a computer with an Intel(R) Core(TM) i7-4790 CPU running at 3.60 GHz and 8 GB of RAM. Table 2 shows the time cost for different problem sizes, parameterized by the number of variables and the bit length of the highest degree. This part of the work was done in the previous conference version of this paper.
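The measurement methodology above can be sketched as a small timing harness. Here `prob_gen` and `priv_ret` are hypothetical stand-ins for the client-side ProbGen and PrivRet algorithms (they perform placeholder arithmetic only), and the harness averages wall-clock time over repeated runs, mirroring how client-side cost would be measured.

```python
import time

# Minimal timing harness in the spirit of the experiment described above.
# prob_gen and priv_ret are placeholders for the real client-side
# algorithms; only the measurement pattern is the point here.

def prob_gen(x):
    return [xi * xi for xi in x]   # placeholder client-side work

def priv_ret(y):
    return sum(y)                  # placeholder client-side work

def time_client_side(x, runs=100):
    """Average wall-clock time (seconds) of ProbGen + PrivRet over `runs` runs."""
    start = time.perf_counter()
    for _ in range(runs):
        priv_ret(prob_gen(x))
    return (time.perf_counter() - start) / runs

avg = time_client_side(list(range(1000)))
print(f"average client-side time: {avg:.6f} s")
```

Averaging over many runs smooths out scheduler noise, which matters when the per-run client cost is small compared with the timer resolution.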
For comparison, we also evaluate the direct algorithm, which computes the polynomial by evaluating each monomial and summing the results. The corresponding results are shown in Table 3, also taken from the previous conference version. The measured time already becomes prohibitively large for the largest case, which is expected from the asymptotic complexity of the direct algorithm. Recalling the complexity of the approach using the closed-form efficient PRF, the scheme achieves a logarithmic-factor speedup for the evaluation of multivariate polynomials. Figure 1 shows a time-cost comparison between the direct algorithm and the outsourcing algorithm for a fixed degree of each variable; this is also from the previous conference version.
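To make the blow-up of the direct algorithm concrete, the following sketch evaluates a dense n-variate polynomial of degree d in each variable by iterating over every exponent tuple, i.e., (d+1)**n monomials. The representation (a dictionary mapping exponent tuples to coefficients) is our own illustrative choice, not the paper's data structure.

```python
from itertools import product

# Naive "direct" evaluation of a dense n-variate polynomial: one term per
# exponent tuple, so (d+1)**n monomials in total, which explains the
# exponential growth of the running time in Table 3. `coeffs` maps an
# exponent tuple (e1, ..., en) to its coefficient.

def direct_eval(coeffs, x):
    total = 0
    for exps, c in coeffs.items():
        term = c
        for xi, e in zip(x, exps):
            term *= xi ** e
        total += term
    return total

# Dense polynomial with all coefficients 1, n = 2 variables, degree d = 2:
n, d = 2, 2
coeffs = {exps: 1 for exps in product(range(d + 1), repeat=n)}
print(len(coeffs))                  # (d+1)**n = 9 monomials
print(direct_eval(coeffs, [1, 1]))  # 9 terms, each equal to 1 -> 9
```

Even modest parameters make the monomial count explode (e.g., n = 20 and d = 3 already give 4**20, over a trillion terms), which is why delegating the evaluation pays off on the client side.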
5. The CP-PVC Scheme for Matrix-Vector Multiplication
In this section, we first review the PRF that is used for the matrix-vector case and give the algorithm for its closed form efficiency. We then give the concrete construction of our CP-PVC scheme for matrix-vector multiplication, together with its analysis and an experimental simulation.
5.1. Algorithm for PRF with Closed Form Efficiency
This PRF is defined over a different domain from that of the polynomial case.
Let the group generator take as input a security parameter and output the description of a group with prime order. The PRF is defined as follows:
(i) KG: Run the group generator to obtain a group description. Choose random values