Survey On Cloud Computing Storage Security


ABSTRACT

Cloud computing is the provisioning of computing as a service, whereby resources and information are delivered to end users over the internet on demand. The cloud thus enables users to access their data from any geographical location at any time, and it has also brought benefits in the form of online storage services. Cloud computing operates through sophisticated techniques, and the minimal requirement for the model to work is a well-established network with proper connections between service providers and end users. Cloud storage services avoid expensive outlays on software and personnel maintenance while providing better performance, lower storage cost and scalability, and they reduce the burden of local data storage. However, maintaining stored data securely is not an easy task in a cloud environment, especially since the stored data may not be completely trustworthy. Because the cloud delivers services over the internet, exposure to storage security vulnerabilities increases, and security remains one of the major drawbacks preventing several large organizations from entering the cloud computing environment. This paper surveys several existing cloud storage frameworks and techniques, along with their advantages and drawbacks, and discusses the challenges involved in implementing secure cloud data storage. The survey results are then used to identify future research areas and methods for addressing the existing drawbacks. Several methods and techniques that ensure storage security in cloud computing are discussed.

Index terms - Cloud Computing, Storage Security and Survey.

INTRODUCTION

Cloud computing is a kind of computing in which shared resources and IT-related capabilities are provided as a service to external customers using internet technologies. It relies on sharing information and computing resources, rather than on local servers or personal devices, to run applications. Cloud computing has begun to receive widespread attention in corporate organizations because it lets the data center behave like the internet, enabling resources to be accessed and shared as virtual resources in a safe and secure manner. To provide data storage service, cloud computing uses networks of a large number of servers, generally running lower-cost consumer PC technology, with specialized connections that distribute data-processing tasks across end users. The reason for moving to the cloud is simply that it allows users to access applications from anywhere at any time through the internet, whereas in the past consumers ran their programs and applications from software downloaded onto a physical server in their home or building. The cloud provides benefits such as flexibility, disaster recovery, automatic software updates, a pay-per-use model and cost reduction. However, it also carries major risks such as security, data integrity, network dependency and centralization. When customers' data is stored in cloud storage, security plays a vital role: customers sometimes store sensitive information in the cloud storage environment, which raises serious security issues, and protecting such sensitive information is one of the difficult problems in cloud computing. In preceding works, several authors have proposed methods for securely storing data in the cloud. This paper discusses how those methods work, along with their advantages and drawbacks.

STORAGE TECHNIQUES IN CLOUD COMPUTING

In this section, various existing techniques are discussed. As a starting point, cloud storage can be considered a network of distributed data centers which typically uses cloud computing technologies, such as virtualization, and offers some kind of interface for storing data.

Optimal Cloud Storage Systems

Effortless data storage in the cloud is gaining popularity for individual, enterprise and institutional data backup and synchronization. Data stored in the cloud is protected, encrypted and replicated depending on scalability and security needs. In this paper, the authors propose a taxonomic approach for achieving cloud storage service optimality along both the consumer's and the resource provider's lifecycle. The proposed scheme contributes a structural definition of storage systems, a notion of storage optimality, a storage service ontology and an optimality-conscious cloud storage controller architecture. Compared with existing work, the authors create a more generic and extensible architecture that serves as a blueprint for an optimal cloud storage controller. They also propose a new, freely available prototype called NubiSave, which implements almost all of the RAOC concepts. For future work, the authors aim to integrate the NubiSave prototype with familiar cloud storage frontends in order to reach a larger base of real users.

Storage Security of data in Cloud

In cloud computing, resources are shared over the network in a public environment, which creates serious security concerns; data transmitted over the internet is vulnerable to intruder attacks, so data encryption plays an important role in the cloud environment. In this paper, the authors introduce a new, consistent security structure for all kinds of clouds and implement a secure cross platform. The proposed method provides essential security services to the cloud computing system, such as authentication, encryption and decryption, and compression. The authors create a network framework consisting of three data backups for data recovery, located at sites remote from the main server. The method uses the SHA hash algorithm, the GZIP algorithm for compression and the SFSPL algorithm for splitting files. In this way the authors deliver a secure cross platform for cloud computing.
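To make the kind of pipeline described above concrete, the sketch below compresses a file with GZIP, records a SHA-256 digest for later integrity checking, and splits the result into fixed-size pieces that could be placed on the main server and the remote backups. It is a minimal illustration of the general idea only, not the authors' SFSPL implementation; the chunk size and function names are assumptions.

```python
import gzip
import hashlib

CHUNK_SIZE = 1024 * 1024  # assumed 1 MiB pieces; the actual SFSPL split policy is not specified here

def prepare_for_upload(path):
    """Compress a local file, record its digest, and split it into pieces."""
    with open(path, "rb") as f:
        data = f.read()

    compressed = gzip.compress(data)                 # GZIP compression step
    digest = hashlib.sha256(compressed).hexdigest()  # SHA digest kept for later integrity checks

    pieces = [compressed[i:i + CHUNK_SIZE]
              for i in range(0, len(compressed), CHUNK_SIZE)]
    return digest, pieces

def verify_after_download(pieces, expected_digest):
    """Reassemble the pieces and confirm the digest before decompressing."""
    compressed = b"".join(pieces)
    if hashlib.sha256(compressed).hexdigest() != expected_digest:
        raise ValueError("integrity check failed")
    return gzip.decompress(compressed)
```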

Online Data Storage Using Implicit Security

Online data storage using implicit security is more beneficial in a cloud environment. In this paper, the authors present an implicit security architecture for online data storage in which security is distributed among many entities, and they also examine a more general method of data partitioning. They propose a data partitioning scheme for online data storage based on the roots of a polynomial in a finite field. The scheme consists of two parts: a (k, k) partitioning scheme and a (k, n) partitioning scheme. Data partitions are stored on randomly chosen servers on the network, and the partitions must be retrieved to recreate the original data. Reconstruction requires access to the servers, the login passwords and knowledge of which servers hold the partitions; the pieces are therefore accessible only to someone who knows both the passwords and where the pieces are stored.
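The authors' construction uses polynomial roots over a finite field; the sketch below only illustrates the (k, k) idea with a much simpler XOR construction, in which every one of the k pieces is required to recover the data and any k-1 pieces are statistically independent of it. It is a hedged stand-in for the concept, not the scheme from the paper.

```python
import os
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_kk(data: bytes, k: int) -> list:
    """(k, k) partitioning: k-1 random pads plus an XOR residue.
    All k pieces are required to reconstruct the data; fewer reveal nothing."""
    pads = [os.urandom(len(data)) for _ in range(k - 1)]
    residue = reduce(_xor, pads, data)
    return pads + [residue]

def join_kk(pieces: list) -> bytes:
    """Reconstruction simply XORs every piece back together."""
    return reduce(_xor, pieces)
```

In the paper's setting, each piece would be placed on a different randomly chosen server, so an attacker must compromise every server (and know which ones) to recover anything.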

Secure and Efficient Storage Protocol

The current trend is for users to outsource their data to a cloud service provider (CSP) that offers ample storage space at low cost, so cloud storage has become an attractive part of the cloud computing environment. In this paper, the authors propose an efficient and secure storage protocol to ensure the integrity and confidentiality of data stored in the cloud. The protocol is built from an elliptic curve cryptography construction and uses a Sobol sequence to check data integrity at random. It comprises three phases: setup, verification, and dynamic data operations with verification. The design allows a TPA to audit the integrity of the stored data: the verifier challenges the cloud server with a random set of blocks, and the server generates a probabilistic proof of integrity. The challenge-response protocol is confidential, meaning it never exposes the data contents to malicious outsiders. Block-level dynamic data operations are supported while keeping the same security assurance, and the proposed scheme relieves both users and the storage service of fears about data leakage and corruption.
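The sketch below shows the shape of such a challenge-response check: the verifier samples a random subset of block indices and the server must answer with proofs for exactly those blocks. It is a simplification under stated assumptions: plain SHA-256 hashes stand in for the protocol's elliptic-curve tags, and uniform random sampling stands in for the Sobol sequence used in the paper.

```python
import hashlib
import random

def make_tags(blocks):
    """Owner-side setup: keep one short tag per stored block
    (hashes stand in for the protocol's cryptographic tags)."""
    return [hashlib.sha256(b).digest() for b in blocks]

def challenge(num_blocks, sample_size, seed):
    """Verifier picks a random subset of block indices; the paper draws
    these from a Sobol sequence, uniform sampling is used here for brevity."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(num_blocks), sample_size))

def respond(blocks, indices):
    """Server returns proofs only for the challenged blocks."""
    return [hashlib.sha256(blocks[i]).digest() for i in indices]

def verify(tags, indices, proofs):
    """Probabilistic guarantee: if any sampled block was altered, its proof fails."""
    return all(tags[i] == p for i, p in zip(indices, proofs))
```

Because the challenged indices are random, a server that has corrupted even a modest fraction of the blocks is caught with high probability after a few rounds.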

Dynamic Storage way in Cloud Computing

Securely preserving all data in the cloud is not an easy job when numerous client applications place demands on it. Data storage in the cloud may not be completely trustworthy because clients do not keep a local copy of the data stored there. Previous schemes say nothing about verifying data integrity between the service provider and the user by comparing data updates in the cloud. To address these issues, the authors propose a new protocol system that uses their data-reading protocol algorithm to verify the integrity of data before and after insertion into the cloud. Clients can check the security of their data in the cloud, with the help of the service provider, using the proposed effective automatic data-reading algorithm. To enable future data recovery, the authors also propose a multi-server data comparison algorithm, with an overall data calculation performed at each update before the data is outsourced to the server's remote access point.

Accessing outsourced data efficiently

The authors aim to propose an approach that achieves flexible access control over dynamic, large-scale data in a safe and effective way. In this paper, they propose an owner-write-user-read scenario for accessing data: only the original data owner can update or modify the data, while cloud users can read information according to their access rights. The approach covers key generation, dynamics handling and overhead analysis. In the key generation part, a key derivation hierarchy is generated and the storage overhead is kept moderate. The dynamics handling part covers dynamic data operations and user access rights. Eavesdropping is countered by over-encryption and lazy revocation.
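The point of a key derivation hierarchy is that the owner stores only a root key and can re-derive any lower-level key on demand, while handing a derived key to a user reveals nothing about its parent or siblings. The sketch below illustrates that property with HMAC-SHA256 as the derivation function; this is an assumed primitive for illustration, not the construction used by the authors.

```python
import hmac
import hashlib

def derive_key(parent_key: bytes, label: str) -> bytes:
    """Derive a child key from its parent; knowing a child key reveals
    nothing about the parent or about sibling keys."""
    return hmac.new(parent_key, label.encode(), hashlib.sha256).digest()

# The owner keeps only the root key and re-derives file and block keys on demand.
root_key = b"owner-root-key-material"                      # placeholder secret
file_key = derive_key(root_key, "file-42")                 # hypothetical file label
block_key = derive_key(file_key, "block-7")                # hypothetical block label
```

A reader granted `block_key` can decrypt that block but cannot climb the hierarchy, which is what keeps the owner's storage overhead for key material small.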

File Assured Deletion (FADE) for Secure Cloud Storage

In this paper, the authors propose a policy-based file assured deletion scheme that reliably deletes files whose access policies have been cancelled (revoked). A working prototype of FADE is implemented on top of Amazon S3, and its performance overhead is evaluated on Amazon S3.

Policy-based file assured deletion

A data file is logically associated with a file access policy and a data key, and each file access policy is associated with a control key. All control keys are maintained by a key manager; when a policy is cancelled, the control key of that policy is removed from the key manager. The main idea is as follows: each file is encrypted with a data key, and this data key is protected with a control key held by the key manager. The control key is deleted when a policy is cancelled, so the encrypted file and its data key can no longer be recovered; even if a copy of the removed file still exists, it remains encrypted and unavailable to everyone. The authors also propose multiple-policy variants, namely conjunctive and disjunctive policies: with conjunctive policies a file can be recovered only if all associated policies are satisfied, whereas with disjunctive policies satisfying any single policy is sufficient. The conclusion is that FADE is practical, supports all dynamic data operations, requires few cryptographic operations and incurs small metadata overhead.
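The sketch below captures the core FADE idea of wrapping a per-file data key with a per-policy control key, so that deleting the control key makes every file under that policy unrecoverable. It is a minimal sketch only: the `cryptography` package's Fernet construction and an in-memory dictionary stand in for the paper's primitives and key manager, and the policy name is a made-up example.

```python
from cryptography.fernet import Fernet

# Key-manager side: one control key per access policy (policy name is hypothetical).
control_keys = {"policy-finance-2017": Fernet.generate_key()}

def store_file(plaintext: bytes, policy: str):
    """Encrypt the file with a fresh data key, then wrap that data key
    with the policy's control key; both outputs can sit on untrusted storage."""
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(plaintext)
    wrapped_key = Fernet(control_keys[policy]).encrypt(data_key)
    return ciphertext, wrapped_key

def read_file(ciphertext: bytes, wrapped_key: bytes, policy: str) -> bytes:
    """Unwrap the data key with the policy's control key, then decrypt the file."""
    data_key = Fernet(control_keys[policy]).decrypt(wrapped_key)
    return Fernet(data_key).decrypt(ciphertext)

def revoke_policy(policy: str):
    """Assured deletion: once the control key is gone, every file under the
    policy stays permanently unreadable even if encrypted copies remain."""
    del control_keys[policy]
```

Conjunctive policies would wrap the data key under several control keys in sequence (all must survive), while disjunctive policies would keep one wrapped copy per control key (any one suffices).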

File Storage Security Maintenance

To ensure the security of data stored in the cloud, the authors propose a system that uses a distributed scheme. The proposed system consists of a master server and a set of slave servers, with no direct communication link between clients and the slave servers. The master server is responsible for processing client requests, while the slave servers carry out the chunking operation to store copies of files, providing data backup for future file recovery. In contrast to existing systems, users can perform effective dynamic data operations. A challenge-response protocol is used to achieve fast localization of errors. Client files are stored in the form of tokens on the main server, and the files are chunked on the slave servers for file recovery. The proposed scheme thus achieves storage correctness assurance and data availability by using a token generation algorithm with homomorphic tokens together with a merging algorithm.
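A rough sketch of this master/slave arrangement is given below: the master keeps only per-chunk verification tokens, while the file body is chunked and replicated across slave stores for later recovery and merging. The chunk size, the use of SHA-256 digests as tokens, and the in-memory "slave" dictionaries are all assumptions made for illustration, not the authors' actual algorithms.

```python
import hashlib

CHUNK_SIZE = 4096  # assumed chunk size

def chunk_file(data: bytes):
    """Split a file into fixed-size chunks."""
    return [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]

class MasterServer:
    """Clients talk only to the master; slave stores hold replicated chunks."""
    def __init__(self, slaves):
        self.slaves = slaves          # list of dicts standing in for slave servers
        self.tokens = {}              # filename -> per-chunk verification tokens

    def put(self, name: str, data: bytes):
        chunks = chunk_file(data)
        self.tokens[name] = [hashlib.sha256(c).digest() for c in chunks]
        for slave in self.slaves:     # replicate every chunk on every slave
            slave[name] = list(chunks)

    def recover(self, name: str) -> bytes:
        """Merge chunks back, accepting any slave whose chunks match the tokens."""
        for slave in self.slaves:
            chunks = slave.get(name)
            if chunks and [hashlib.sha256(c).digest() for c in chunks] == self.tokens[name]:
                return b"".join(chunks)
        raise IOError("no intact replica found")
```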

Ensuring Efficient and Flexible Distributed Storage System

The authors propose an effective and flexible distributed scheme with explicit dynamic data operations to guarantee the correctness of users' data in the cloud. To ensure data availability and provide redundancy, they rely on erasure-correcting codes when preparing the file distribution. Compared with the traditional replication-based file distribution technique, this construction greatly reduces storage and communication overhead. Support for dynamic data operations is of paramount importance for storage correctness assurance. The method combines erasure-correcting codes with homomorphic tokens to achieve storage correctness assurance and fast localization of errors, and the use of distributed protocols for storage correctness leads to more robust and secure cloud data storage. In future work, the authors aim to construct a framework that supports both public verifiability and storage correctness assurance.
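To see why erasure coding is cheaper than full replication, the sketch below shows the simplest possible case: a single XOR parity fragment spread over m + 1 servers, which tolerates the loss of any one fragment at a storage cost of 1/m extra rather than a full copy. The paper itself uses Reed-Solomon codes, which generalize this to several simultaneous losses; the code here is only an illustration of the principle and assumes equal-length fragments.

```python
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode(fragments):
    """Place m equal-length data fragments plus one XOR parity fragment
    on m + 1 servers (Reed-Solomon generalizes this to multiple losses)."""
    return fragments + [reduce(_xor, fragments)]

def recover(stored):
    """Rebuild a single missing fragment (marked None) from the survivors,
    then drop the parity fragment to return the original data fragments."""
    missing = [i for i, frag in enumerate(stored) if frag is None]
    if len(missing) > 1:
        raise ValueError("single parity tolerates only one lost fragment")
    if missing:
        stored[missing[0]] = reduce(_xor, [f for f in stored if f is not None])
    return stored[:-1]
```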

Storing and accessing small files on cloud storage

The Hadoop Distributed File System (HDFS) is widely adopted to support internet services. In this paper, the authors examine several causes of the small-file problem in native HDFS: a large number of small files imposes a heavy burden on the NameNode, file correlations are not considered for data placement, and no optimization mechanism such as prefetching is provided. To overcome these problems, the authors propose an approach that improves the storage and access efficiency of small files on HDFS. HDFS is a representative internet file system that runs on clusters but largely ignores the problem of storing and accessing small files. The cut-off point between small and large files is determined experimentally in the context of HDFS, which helps improve I/O performance. From a taxonomic point of view, files are classified into three types: independent files, structurally related files and logically related files. Finally, a prefetching technique is used to improve access efficiency, and correlations are considered when storing files. In future work, a formula for the cut-off point will be studied and the relationship between storage and access efficiency will be investigated.
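The basic remedy for the NameNode burden is to merge many correlated small files into one container file plus an offset index, so that thousands of small files cost the NameNode only a single entry; prefetching then pulls in neighbouring files from the same container. The sketch below shows that packing step in plain Python as an illustration; the function names and the in-memory index are assumptions, not the paper's HDFS implementation.

```python
def pack_small_files(files: dict) -> tuple:
    """Merge small files into one container blob plus an offset index.
    On HDFS the container would be a single large file, so the NameNode
    tracks one entry instead of thousands."""
    container = bytearray()
    index = {}                                   # name -> (offset, length)
    for name, data in files.items():
        index[name] = (len(container), len(data))
        container.extend(data)
    return bytes(container), index

def read_small_file(container: bytes, index: dict, name: str) -> bytes:
    """Random access into the container using the index."""
    offset, length = index[name]
    return container[offset:offset + length]
```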

Publicly Auditable Storage Services for Cloud data

The cloud allows data owners to store their data remotely and to use on-demand, high-quality cloud applications; outsourcing data to the cloud thus removes the owner's burden of local data storage, hardware management and maintenance. At the same time, however, it removes the owner's personal control over the dependability and security of that storage. In this paper, the authors propose a network architecture for publicly auditable cloud data storage services that helps to develop, evaluate and describe storage problems efficiently. They also recommend a set of cryptographic and system properties that public auditing services must consistently satisfy to be reliable. The authors use homomorphic authenticators with a random masking technique, which preserves the privacy of the data owner's content during the auditing process. The proposed method also supports block-level dynamic data operations. Future challenges for public auditing include accountability, performance and the multi-writer model.

Fine-grained and Safer Cloud Data Access Control

Moving user data into the cloud reduces the storage burden and also offers an easier way to access that data, but the security of data in the cloud is a critical issue, especially since cloud servers do not reside in a trusted domain. In this paper, the authors propose a system that enables data owners to achieve fine-grained access control over data files maintained by cloud servers, concentrating on the one-to-many scenario. Each message is associated with a set of attributes and encrypted with a public key. The proposed system consists of four entities: the cloud server, the data owner, the data consumer and an optional TPA. Four algorithms are used: setup, encryption, key generation and decryption. The major advantages are scalable, secure and flexible access control, with data confidentiality achieved as well.

Identity-Based Authentication

In cloud computing, resources and services are distributed across numerous consumers, so there is a risk of various security threats; authentication of both users and services is therefore an important requirement for cloud security and trust. Applying the SSL Authentication Protocol (SAP) to the cloud is complex, imposes heavy computation and communication costs on consumers, and is inefficient. As an alternative to SAP, the authors propose a new identity-based authentication protocol built on an identity-based hierarchical model with corresponding signature and encryption schemes. The proposed protocol is certificate-free and well aligned with cloud requirements. Identity-based encryption (IBE) and identity-based signature (IBS) schemes are used to secure cloud communication. In terms of performance, the identity-based authentication protocol is far more lightweight and efficient than SAP, particularly on the client side.

Public Auditing with Complete Data Dynamics support

Cloud computing shifts databases and application software into centralized data centers, where data management may not be completely trustworthy; verifying data integrity at unreliable servers is therefore a major concern in cloud storage. This paper first identifies potential security threats and shortcomings of preceding works and then builds a refined verification scheme. The authors propose a public auditing system with a protocol that supports fully dynamic data operations. To support data dynamics, existing proof-of-storage models such as PDP and PoR are improved by manipulating the classic Merkle Hash Tree (MHT) for block tag authentication. The proposed system is further extended to allow the TPA to perform multiple auditing tasks simultaneously by employing the bilinear aggregate signature technique.
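For readers unfamiliar with the MHT, the sketch below builds a basic Merkle hash tree over file blocks, the structure this scheme manipulates for block tag authentication. SHA-256 and the odd-node duplication rule are assumptions made for illustration; the paper's scheme additionally signs the root and transmits sibling paths during audits.

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks) -> bytes:
    """Bottom-up construction: leaves are block hashes, and each parent
    hashes the concatenation of its two children."""
    level = [_h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Changing any single block changes the root, so the verifier only needs the
# authenticated root plus a short sibling path to check one block's tag, and
# updating a block (a dynamic data operation) only touches one path in the tree.
```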

Public Auditing Scheme for Preserving Privacy

By using cloud storage, consumers outsource their data remotely without the worry of maintenance and local data storage, but they no longer have physical control over the outsourced data, which introduces integrity risks. Public auditing for cloud storage is therefore of critical importance, since it lets users delegate the integrity checking task to an optional third party auditor (TPA). In this paper, the authors propose a secure cloud storage system that supports privacy-preserving public auditing, and they extend the work to allow the TPA to audit for multiple customers at the same time. A homomorphic linear authenticator (HLA) based on a public key is used to perform auditing efficiently, and combining the HLA with a random masking technique ensures that the TPA gains no knowledge of the user's data content. The proposed system also supports batch auditing and block-level dynamic data operations.

Efficient Third Party Auditing (TPA)

Users store their data in the cloud, so security and data storage correctness are primary concerns. This study focuses on ensuring the integrity and storage security of outsourced data in the cloud. In this paper, the authors introduce a novel, uniform security structure for all kinds of clouds. To achieve data storage security, the BLS (Boneh-Lynn-Shacham) algorithm is used to sign the data blocks before they are outsourced to the cloud; BLS is more efficient and secure than comparable algorithms. Batch auditing is achieved using the bilinear aggregate signature technique, and the Reed-Solomon technique is used for error correction and to ensure data storage correctness. Multiple batch auditing is an important feature of this work: it allows the TPA to perform multiple auditing tasks for different users at the same time. Future work will focus on designing a scheme that supports both public verifiability and storage correctness assurance.

Secure and Dependable Storage Services

Cloud storage lets users store their data in the cloud and enjoy high-quality on-demand applications without the worry of local data storage and maintenance. Although the cloud provides these benefits, such a service gives up the user's direct control over the outsourced data, which introduces new security risks to data correctness in the cloud environment. To address this security issue and to provide assurances of cloud data integrity and availability, the authors propose a flexible, distributed storage integrity auditing mechanism. The mechanism allows users to audit cloud data storage, using homomorphic tokens with Reed-Solomon erasure-correcting codes to guarantee storage correctness and to identify misbehaving servers quickly. The authors extend the design to support block-level dynamic data operations such as insertion, deletion, modification and appending. If users lack the time, resources or expertise, they can safely delegate the auditing task to an optional third party auditor (TPA).

Table 1 Comparative analysis of the advantages and limitations of existing storage techniques

Secure and Dependable Storage Services
Proposed approach: Distributed, flexible storage integrity auditing mechanism; homomorphic token with Reed-Solomon erasure correcting code.
Advantages: Guarantees storage correctness and identifies misbehaving servers; achieves a good balance between data dynamics and error resilience.
Restrictions: Overall computation and communication overhead remains approximately the same.

Storing and accessing small files on cloud storage
Proposed approach: Hadoop Distributed File System (HDFS), with a prefetching technique to improve access efficiency.
Advantages: Improves the storage and access efficiency of small files; the cut-off point is measured to improve I/O performance.

File Assured Deletion (FADE) for Secure Cloud Storage
Proposed approach: Policy-based file assured deletion (FADE) scheme; conjunctive and disjunctive policies govern file recovery.
Advantages: Supports dynamic data operations; metadata overhead is small.

Accessing outsourced data efficiently
Proposed approach: Owner-write-user-read scenario for accessing data.
Advantages: Only the original data owner can update or modify the data.

Storage Security of data in Cloud
Proposed approach: New, consistent security structure for all kinds of clouds using the SHA hash, GZIP and SFSPL algorithms.
Advantages: Provides data backups for recovery; includes essential security services such as authentication, encryption and decryption, and compression.

Online Data Storage Using Implicit Security
Proposed approach: Implicit security architecture and data partitioning scheme for online data storage.
Advantages: Partitioned data pieces reveal no user information.
Restrictions: If users forget where the data pieces are stored, recovery becomes difficult.

Dynamic Storage way in Cloud Computing
Proposed approach: New protocol system using a data-reading protocol algorithm and a multi-server data comparison algorithm for data recovery.
Advantages: Integrity can be verified before and after data insertion.

Public Auditing Scheme for Preserving Privacy
Proposed approach: Homomorphic linear authenticator (HLA) based on a public key for efficient privacy-preserving public auditing.
Advantages: The TPA gains no knowledge of user data content during auditing; supports batch auditing and dynamic data operations.

Secure and Efficient Storage Protocol
Proposed approach: Efficient and secure storage protocol built from elliptic curve cryptography and a Sobol sequence.
Advantages: Block-level dynamic data operations maintain the same security assurance.

Identity-Based Authentication
Proposed approach: Authentication protocol based on an identity-based hierarchical model.
Advantages: The proposed protocol is very lightweight and more efficient.

Efficient Third Party Auditing (TPA)
Proposed approach: Novel, uniform security structure; the BLS algorithm is used to achieve storage security.
Advantages: Supports batch auditing; the TPA can perform multiple auditing tasks for different users at the same time.

Optimal Cloud Storage Systems
Proposed approach: Taxonomic approach for achieving cloud storage service optimality; a new prototype called NubiSave.
Advantages: The generic architecture serves as a blueprint for an optimal storage controller; NubiSave is freely available.

File Storage Security Maintenance
Proposed approach: Distributed scheme with a master server and a set of slave servers; token generation and merging algorithms.
Advantages: File chunking provides data backup in case of server failure.

Public Auditing with Complete Data Dynamics support
Proposed approach: Public auditing system with a protocol that supports fully dynamic data operations; the Merkle Hash Tree (MHT) is manipulated for block tag authentication.
Advantages: The TPA can perform multiple auditing tasks simultaneously.

Fine-grained and Safer Cloud Data Access Control
Proposed approach: System that enables data owners to achieve fine-grained access control over data files, using setup, encryption, key generation and decryption algorithms.
Advantages: Scalable, secure and flexible access control with data confidentiality.

Publicly Auditable Storage Services for Cloud data
Proposed approach: Network architecture for a publicly auditable cloud data storage service using homomorphic authenticators with a random masking technique.
Advantages: Public auditing becomes reliable owing to consistent cryptographic properties; supports dynamic data operations.

Ensuring Efficient and Flexible Distributed Storage System
Proposed approach: Effective and flexible distributed system with explicit dynamic data operations; homomorphic tokens with erasure correcting codes.
Advantages: Distributed protocols for storage correctness achieve more robust and secure cloud data storage.

CONCLUSION

Cloud computing is an emerging computing paradigm that allows users to share resources and information from a pool of distributed computing facilities as a service over the internet. Even though the cloud provides benefits to users, the security and privacy of stored data remain major issues in cloud storage. Cloud storage is considerably more beneficial and advantageous than earlier traditional storage systems, especially in its scalability, cost reduction, portability and functionality. This paper has presented a survey of secure storage techniques in cloud computing. It first gave an overview of cloud computing's basic features and of the challenges and barriers faced by users and service providers, then analyzed the various kinds of cloud service providers together with their characteristics and benefits. Finally, several storage techniques that secure data in the cloud were discussed in detail, and the need for future research on storage methods that provide much better security and accountability was highlighted.


