Abstract
Cloud storage has become a prominent outsourcing service capable of replacing physical disk drives. A threat from an untrusted server, however, affects the security and integrity of the outsourced data, and stronger data integrity guarantees typically come at a higher communication and computation cost. It is hence necessary to develop a model that provides a trade-off between data integrity and cost metrics in the cloud environment. In this paper, we develop an integrity verification mechanism that combines a cryptographic solution with an algebraic signature. The model utilises the elliptic curve digital signature algorithm (ECDSA) to verify the outsourced data. The scheme further resists malicious attacks, including forgery attacks, replacing attacks, and replay attacks, while symmetric encryption guarantees the privacy of the data. A simulation is conducted to test the efficacy of the algorithm in maintaining data integrity at reduced cost. The performance of the entire model is compared against existing methods in terms of communication cost, computation cost, and overhead cost. The simulation results show that the proposed method reduces computation cost by 0.25% and communication cost by 0.21% relative to other public auditing schemes.
1. Introduction
Cloud storage is a modern paradigm among cloud computing services and has proven to deliver extraordinary data storage and management capabilities. Individuals and enterprises tend to outsource their personal or official data to the cloud server via a pay-as-you-go model. The storage services that hold the outsourced data greatly reduce the local storage requirements of users. Verifying the integrity of the outsourced data, however, is considered a significant challenge in cloud computing [1, 2].
Uploading and downloading the data in order to test the integrity of the outsourced data dramatically increases the processing and communication overhead. Cloud storage devices, moreover, are often attacked by hackers, and the data might get stolen while they are being outsourced. Hence, it is essential for the cloud service provider (CSP) to protect the outsourced data against loss or corruption in order to maintain the trust of the users. The CSP may further reclaim storage space by properly removing redundant information or data that are accessed less frequently [3] in order to avoid leaks of private or confidential information. It is hence necessary for the CSP to adopt an effective protocol that validates data integrity in the cloud storage environment.
Various methods have been developed in the literature that support private and public verification of large outsourced data. Data verification allows cloud users to validate the integrity of their outsourced data. However, such substantial computation poses a serious burden on the CSP, whose cloud resources are constrained. Public verification of outsourced data reduces the computing cost of the client by delegating the integrity check to a third-party authority (TPA). With such optimal processing and reduced use of resources, public verification has gained increased attention in the recent past [4–14].
In order to support dynamic data updates, researchers have developed several models [3, 15–18] that update the outsourced data without downloading the complete data. Certain techniques allow the data to be updated dynamically using cryptographic encryption, which requires considerable cloud resources and increases the computational and communication cost significantly.
There still exist various pitfalls that need to be addressed. Multiple storage spaces are explicitly required for storing the outsourced data. Storage activities such as deletion and insertion of data may result in increased computation and communication cost, since the dynamic movement of data cannot be avoided. Furthermore, the lack of communication links poses a serious challenge in locating the required outsourced data. Such a system is highly vulnerable to forgery, replay, and replacement attacks. In order to mitigate these challenges, an integrity verification scheme is designed in the proposed method that uses a cryptographic algorithm to verify the outsourced data.
In this paper, an integrity verification mechanism is designed that combines a cryptographic solution with an algebraic signature. The model utilises the elliptic curve digital signature algorithm (ECDSA) to verify the outsourced data. The scheme further resists malicious attacks, including forgery attacks, replacing attacks, and replay attacks, while symmetric encryption guarantees the privacy of the data.
The outline of the paper is as follows. Section 2 reviews the related works. Section 3 describes the proposed method. Section 4 evaluates the work, and Section 5 concludes with possible directions for future work.
2. Related Works
Wang et al. [6] developed an auditing model combined with a privacy-preserving approach. The model employs a homomorphic random mask in order to prevent the TPA from learning the content of the outsourced data during auditing.
Shacham et al. [12] developed an integrity model using a public verification scheme based on the BLS signature. The BLS signature [13] keeps the communication and processing requirements low.
Chen et al. [14] developed an algebraic signature solution to check data integrity. This model improves the efficiency of verification without requiring a public key.
Sookhak et al. [18] developed a verification model that supports an unlimited number of verifications, but it suffers from security flaws: the secret key can be computed from the signature together with the tags and data blocks once the tags are compromised.
Juels et al. [19] proposed an identity verification model that helps preserve data privacy, where the cloud leverages the user identity to validate the integrity.
Ateniese et al. [20] developed a model that checks data integrity in the cloud using a technique that combines block tags with homomorphic encryption.
3. Proposed Method
We begin by illustrating the system model and then describe the verification scheme, whose security relies on the elliptic curve discrete logarithm problem (ECDLP).
3.1. System Model
The suggested public verification approach uses a three-party model, as depicted in Figure 1. The roles in this model are as follows:
(i) the users, who rely on the cloud to store their data;
(ii) the CSP, which provides users with large-scale storage and computing resources;
(iii) the TPA, which checks the data integrity in response to user requests.

In Figure 1, the proposed verification scheme shows the communication between the TPA and the cloud service provider, which exchange challenge information and proof information. The user and the TPA exchange version information, verification requests, and verification results. The data are transmitted to the CSP only after a confirmation is obtained from the TPA.
We now describe the capability of each party in the proposed paradigm. To begin with, the TPA is assumed to be honest but curious: it is truthful in its data integrity checks. Furthermore, the CSP is untrustworthy, since it has the option of concealing data loss or corruption in order to maintain user trust. As a result, the CSP may carry out the following attacks: forgery attack, replay attack, and replacing attack.
A data structure is designed in this part to facilitate dynamic data updating. It combines the benefits of a linked list and a hash table and is maintained by the TPA. The structure is made up of a hash table and several linked lists. The user organises the data blocks into groups and computes each group index based on its length. Each group index and length are saved in the hash table, and pointers link each entry to the corresponding data version information.
3.2. Verification Scheme
To begin with, the TPA merely has to change the list pointers when inserting and deleting items. Second, the group index can be stored and managed in a single continuous space, which is convenient in practice. Third, if operations such as insertion or deletion are frequent, the TPA can adjust the structure flexibly. As a result, the data structure can lower the computational and communication costs of update procedures dramatically, as illustrated by the sketch below.
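A minimal sketch of such an index structure is given below, assuming the hypothetical names GroupIndex and VersionNode (the paper does not provide an implementation). Insertion and deletion only relink pointers, so no block data is moved:

```python
# Minimal sketch (not the authors' exact structure): a hash table of data
# groups, each pointing to a linked list of per-block version records held
# by the TPA. Names such as VersionNode and GroupIndex are illustrative.

class VersionNode:
    """Linked-list node holding version information for one data block."""
    def __init__(self, block_id, version):
        self.block_id = block_id
        self.version = version
        self.next = None

class GroupIndex:
    """Hash table keyed by group index; each entry stores the group length
    and the head pointer of its version list."""
    def __init__(self):
        self.table = {}  # group_id -> {"length": int, "head": VersionNode}

    def insert(self, group_id, block_id, version=1):
        entry = self.table.setdefault(group_id, {"length": 0, "head": None})
        node = VersionNode(block_id, version)
        node.next = entry["head"]              # relink pointers only
        entry["head"] = node
        entry["length"] += 1

    def delete(self, group_id, block_id):
        entry = self.table.get(group_id)
        if entry is None:
            return False
        prev, cur = None, entry["head"]
        while cur:
            if cur.block_id == block_id:
                if prev is None:
                    entry["head"] = cur.next   # unlink the head node
                else:
                    prev.next = cur.next       # bypass the deleted node
                entry["length"] -= 1
                return True
            prev, cur = cur, cur.next
        return False
```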
3.2.1. Key Initiation
For encrypting data blocks, the user first produces a symmetric key dk. He then chooses x ∈ Zq at random and calculates GA = x · G, where G is a generator point known to both the TPA and the user.
The public key is computed as Q = d · G, where d is the scalar private key.
Two different integers, namely, s and r, are used to compute the signature. The integer r is computed from the base point G = (xG, yG) and a random number k as (x1, y1) = k · G and r = x1 mod q.
Meanwhile, with an algebraic signature, the user selects a secure element. The secret key is (dk, x) and the public key is GA in this case.
For the signature to be valid, the integer r must not be zero; if r = 0, a new random number k is generated and r is computed again. Once a valid r is obtained, the integer s is computed as s = k⁻¹ (h(m) + d · r) mod q, where h(m) is the message digest, d is the private key, r is the value computed above, and k is the random number.
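A minimal sketch of key initiation and ECDSA signing is given below, assuming the third-party "cryptography" package is available. The names dk and GA follow the text; for simplicity a single key pair stands in for the scalars d and x, so this is an illustration rather than the authors' implementation:

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.utils import decode_dss_signature

# Symmetric key dk for encrypting data blocks
dk = Fernet.generate_key()

# Random private scalar and public point GA = x * G on the P-256 curve
signing_key = ec.generate_private_key(ec.SECP256R1())  # holds the scalar
GA = signing_key.public_key()                           # GA = x * G

# ECDSA signature over a message digest h(m); the library regenerates the
# random k internally, so r = 0 never appears in the output signature.
message = b"example data block"
signature = signing_key.sign(message, ec.ECDSA(hashes.SHA256()))
r, s = decode_dss_signature(signature)   # s = k^-1 (h(m) + d*r) mod q
```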
3.2.2. Data Blocks’ Encryption
To encrypt each data block Mi, the user applies the symmetric encryption algorithm Enc() with the key dk to obtain the encrypted block Mi.
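A minimal sketch of block encryption with the symmetric key dk follows; Fernet (an AES-based construction from the "cryptography" package) stands in for the unspecified Enc() algorithm:

```python
from cryptography.fernet import Fernet

enc = Fernet(dk)                      # dk was generated during key initiation
M_i = b"plaintext data block i"
M_i_enc = enc.encrypt(M_i)            # encrypted block outsourced to the CSP
assert enc.decrypt(M_i_enc) == M_i    # the key holder can always recover M_i
```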
3.2.3. Tag Initiation
For each encrypted data block, the user computes the corresponding data block tag.
The user then deletes the data that were previously saved locally. The TPA is in charge of launching a verification challenge to ensure that the outsourced data are intact. It is worth noting that the technique uses dk for data protection, whereas tag initiation uses x, which supports public verification. Owing to the elliptic curve discrete logarithm problem underlying ECDSA, it is computationally difficult for attackers to extract x from the public key GA.
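The paper's exact tag construction is not reproduced here. As a hedged placeholder, the sketch below forms each tag as an ECDSA signature under the secret scalar over the block index concatenated with the encrypted block, so that anyone holding GA can verify the tag without learning dk or the plaintext; it reuses the objects from the previous sketches:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def make_tag(signing_key, index, encrypted_block):
    # Placeholder tag: sign (index || encrypted block) with the user's key.
    payload = index.to_bytes(8, "big") + encrypted_block
    return signing_key.sign(payload, ec.ECDSA(hashes.SHA256()))

tag_i = make_tag(signing_key, 0, M_i_enc)
```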
3.2.4. Challenge
The user forwards the verification request to the TPA. The TPA selects a subset of blocks from the pool of outsourced data blocks and forwards the challenge information to the CSP, thereby initiating a challenge.
3.2.5. Proof Generation
Upon receiving the challenge information, the CSP computes the corresponding proof over the challenged blocks and returns it to the TPA.
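A hedged sketch of the challenge/proof exchange is given below. The paper's exact proof formula is not reproduced; here the CSP simply returns the tags of the challenged blocks together with a digest over the (index, block) pairs:

```python
import hashlib
import random

def make_challenge(num_blocks, sample_size):
    """TPA side: pick a random subset of block indices to challenge."""
    return sorted(random.sample(range(num_blocks), sample_size))

def make_proof(challenge, stored_blocks, stored_tags):
    """CSP side: aggregate the challenged blocks and gather their tags."""
    digest = hashlib.sha256()
    for i in challenge:
        digest.update(i.to_bytes(8, "big") + stored_blocks[i])
    return {"digest": digest.hexdigest(),
            "tags": [stored_tags[i] for i in challenge]}
```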
3.2.6. Proof Verification
The TPA computes the aggregated hash value over the challenged blocks and verifies it against the proof returned by the CSP.
3.2.7. Signature Verification
Signature verification is the counterpart of signature computation. It verifies the authenticity of the message after proof verification, using the public key of the authenticator. A secure hash algorithm, the same one used during signature formation, is used to compute the digest of the authenticator's signed message. Verification is essentially performed using the components of the digital signature, r and s, and the public key Q (x, y).
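A minimal sketch of ECDSA signature verification with the public key GA follows, again assuming the "cryptography" package; internally the library checks that u1·G + u2·GA has x-coordinate congruent to r mod q, where u1 = h(m)·s⁻¹ mod q and u2 = r·s⁻¹ mod q:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def verify_signature(GA, signature, message):
    # Returns True if (r, s) is a valid signature on message under GA.
    try:
        GA.verify(signature, message, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False
```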
4. Results and Discussion
In this section, the simulation is conducted with the CloudSim software tool on a high-end computing engine consisting of an i9 processor with 8 GB RAM. The study transfers image files from the user module to the CSP module after obtaining verification from the TPA. The CSP consists of 10 virtual machines (VMs) running on 24 cores. The simulation assesses the communication and computation cost at the various ends.
Figure 2 compares the storage computation cost of the proposed public auditing scheme with that of other methods. The simulation results show that the proposed method achieves a lower storage cost than the other methods; the use of ECDSA helps reduce the computation cost.

Figure 3 compares the storage communication cost of the proposed public auditing scheme with that of other methods. The simulation results show that the proposed method achieves a lower storage communication cost than the other methods; the use of ECDSA helps reduce the communication cost.

Figure 4 compares the data communication cost of the proposed public auditing scheme with that of other methods. The simulation results show that the proposed method achieves a lower data communication cost than the other methods; the use of ECDSA helps reduce the communication cost.

Figure 5 compares the data computation cost of the proposed public auditing scheme with that of other methods. The simulation results show that the proposed method achieves a lower data computation cost than the other methods; the use of ECDSA helps reduce the computation cost.

Figure 6 compares the control overhead cost of the proposed public auditing scheme with that of other methods. The simulation results show that the proposed method achieves a lower control overhead cost than the other methods; the use of ECDSA helps reduce the overhead cost.

5. Conclusions
In this paper, we developed an integrity verification mechanism that combines a cryptographic solution with an algebraic signature. The model utilises the elliptic curve digital signature algorithm (ECDSA) to verify the outsourced data. The scheme resists malicious attacks, including forgery attacks, replacing attacks, and replay attacks, while symmetric encryption guarantees the privacy of the data. A simulation was conducted to test the efficacy of the algorithm in maintaining data integrity at reduced cost. The simulation results show that the proposed method achieves lower computational and communication costs than other public auditing schemes. In future work, advanced cryptographic encryption models will be explored to further reduce the computational and communication cost in cloud systems.
Data Availability
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
Ethical Approval
No participation of humans has taken place in this implementation process.
Disclosure
No violation of human and animal rights is involved.
Conflicts of Interest
The authors declare that they have no conflicts of interest.