The History Of Smart Cities Concept


02 Nov 2017


This chapter defines the concept of smart cities and shows how it contributes to the development and improvement of the socio-economic activities of society. It first defines the specific elements required to achieve a smart city, then highlights models of smart solutions implemented around the world, and closes with a case study on the readiness of a city's employees for smart solutions. The aim is to investigate the implications of smart solutions for sustainable city development and to gauge the readiness of employees for them. These solutions concentrate on the core areas of the city: administration, education, health, transportation, etc. With this purpose in view, the framework for a case study is built up employing quantitative and qualitative research for a mid-sized Romanian city. Exploratory research techniques combined with a survey methodology have been used to study the employees' readiness for smart solutions.

SMART CITIES CONCEPT

The concept of the smart city has been introduced to highlight the importance of Information and Communication Technologies (ICTs) in the last 20 years (Schaffers, 2012).

In the literature, the term smart city is used to denote a city's ability to respond as promptly as possible to the needs of its citizens.

Quality of life and city development are profoundly influenced by the core systems (Figure 14) of a city: transport, government services and education, public safety and health (Choenni, 2001). The analysis and development of a city must therefore start from these four areas.

Figure 14. The core systems of a smart city

Research has focused on these four areas - education, health, transport, public administration - which are identified as having high priority. For these areas, the use of new technologies by employees is highlighted.

The literature review highlights that various aspects of improving life in a city are mentioned in connection with the term smart city: transportation, education, public administration, health care, security/safety, green, efficient and sustainable, energy, etc.

The literature (Choenni, 2001), (Dirks, 2010), (Giffinger, 2007) shows that the most important area in which to start transforming a city into a smart one is the transportation system, which aims to make use of modern transport technologies. Smart transportation systems are the best example of harmony between city development and modern technologies.

The term smart city is also used in the literature (Begawan, 2010), (Caragliu, 2009), (Choenni, 2001), (Schaffers, 2012) with regard to the education of a city's inhabitants. A smart city therefore has smart inhabitants in terms of their educational level. Intelligent systems represent an important part of the future educational process and will affect the way information is received, used, understood and learned by users.

If the inhabitants are educated, they will know how to work for city development while keeping in view the limits of natural resources. An intelligent educational system (Dirks, 2010) is based on three elements: interconnection (technology for sharing educational resources), instrumentation (accumulation of the necessary data) and intelligence (making decisions that enhance the learning process).

In other literature (Dirks, 2009), the term smart city refers to the relation between the city government or public administration and its citizens. Good governance as an aspect of a smart administration often also refers to the use of new channels of communication with citizens, e.g. "e-governance" or "e-democracy" (Dirks, 2009).

The health system is another area highlighted as a good target for a smart city, and it implies using modern technologies to obtain better results (Choenni, 2001). Smart health systems aim to improve the quality of life for patients by allowing timely diagnosis and therapies, reducing health care costs and reducing the time needed to access hospital care.

The term smart city has attracted a lot of attention in recent years. Since the end of the last century many cities have initiated smart city initiatives.

A useful starting definition is to call a city "smart" when "investments in human and social capital and traditional (transportation) and modern (ICT) infrastructure fuel sustainable economic growth and a high quality of life, with a wise management of natural resources, through participatory government" (Caragliu, 2009).

Among the countries around the world that have implemented smart cities, three distinct levels can be identified (Spiro, 2006), (Schaffers, 2012) (Figure 15):

The first level of the smart city is the physical telecommunications network infrastructure, comprising the wiring and wireless equipment, together with any servers and routers required to operate the infrastructure.

The second level consists of applications that facilitate operations in the city, such as traffic control. Such applications are provided by many vendors, using the underlying infrastructure.

The third level is ubiquitous connectivity of everything.

In the last five years, many new ideas have been developed in terms of urban life. One of the most successful concepts is the ubiquitous city. The idea was born in South Korea and aims to be a new model of a sustainable economy based on more efficient use of communication solutions, transportation and natural resources. Such a city manages information technology ubiquitously.

Figure 15. The levels of a smart city

In order to create a complete U-city, new smart solutions are needed for water management, traffic and even health care.

This city model has been and is being investigated and implemented in European Union cities such as Oulu in Finland, where all information systems in the city are linked and everything is connected to an information system through technologies such as wireless networks.

The concept of the ubiquitous city, or U-city, has developed into a huge international research topic and is also known as the smart city in other countries. It is a smart city model based on the use of computer systems to exchange data, such as cloud computing and open data.

The first level of this new model of urban life is one in which houses, streets, offices and transport communicate with each other and can be accessed from anywhere.

The first U-cities were implemented in South Korea (e.g. New Songdo) and Japan (e.g. Osaka); they deliver information anytime, anywhere, to anybody, using interconnected information systems and ubiquitous ICT solutions across the city (Bourdeau, 2008).

Figure 16. U-city (Pollalis, 2006)

There are many advantages to this new system: energy and all natural resources are used more efficiently, and synchronization tasks are much easier to accomplish.

The U-city is based on ubiquitous computing and IT solutions to improve the quality of life (Figure 16). These cities use information and communication technology to connect the activities taking place in the city. In Oulu, Finland, a UBI (urban interaction) program was implemented, coordinated by the University of Oulu. This solution is based on hotspots - public interactive screens that facilitate effective communication between citizens and government.

All over the world we can identify many smart cities (Table 2) at different levels of development, and it is evident that the concept is used to define an urban evolution based on modern technologies.

Table 2. Smart cities around the world

Helsinki – A smart city cluster, including the Helsinki region, focusing in particular on mobile and wireless technologies and applications.

Lisbon – Lisbon's ambition as a smart city is to improve the city's liveliness and quality of life, namely through the active involvement of citizens in the city's governance model. Lisbon aims to become an international hub for world-scale companies, benefiting from the bridge it represents between Europe, Africa and America.

Manchester – Uses modern technologies to promote community engagement, capacity building and social capital.

New Songdo – Its first objective is the use of ubiquitous computing across the city.

Osaka – Based on ubiquitous information systems in the city area.

Oulu – In recent years Oulu has become a city of technology and innovation, aiming to be the most highly developed city in Finland and Northern Europe.

Barcelona – Has implemented ICT to pursue social and urban growth. The smart city concept was used as a strategic tool, with infrastructure, open data, service innovation and human capital as its pillars.

International practice shows that the evolution of the smart city is based on:

Ubiquitous computing;

Wireless;

Readiness for change, because ICT evolution implies being ready to use new solutions at any time.

The analysis highlights that a smart city is more than technology and infrastructure: it is a universe of smart applications and platforms that empower citizens in innovative ventures. The strong idea is that a smart city is a strategy and an objective for every urban area, and in some parts of the world it is already a reality (Jungwoo, 2011).

According to recent studies (Anthopoulos, 2010), (Begawan, 2010), (Bourdeau, 2008) and (Giffinger, 2007), the major advantage is improved quality of life.

All the cities that implemented smart solutions aimed to improve citizens' everyday life. In recent years, implementing smart solutions in different European Union countries has achieved:

An increase in the employment rate for men and women aged between 20 and 64, with a larger number of young, older and low-skilled people employed, coupled with a better integration of legal immigrants;

Improved conditions for research and development, in order to increase investment levels and stimulate research, development and innovation;

A reduction of greenhouse gas emissions, an increased share of renewables in final energy consumption and greater energy efficiency;

Improved education levels, by reducing dropout rates and increasing the proportion of persons aged 30-34 with university degrees or equivalent qualifications;

Promoting social inclusion by reducing poverty and eliminating the risk of poverty.

The planet-development literature uses the concept smart/smarter to refer to the intelligence being infused into the systems and processes that make the world work. Smart/smarter means infusing intelligence into things that no one would recognize as computers: cars, appliances, roadways, power grids, clothes, even natural systems such as agriculture and waterways.

Building a smart planet starts from three main ideas:

Instrument the world’s systems;

Interconnect the world’s systems;

Make the world’s systems intelligent.

It is essential for developing a city to have a strategy for a smarter city. This strategy will help determine where and when to invest, will articulate key milestones and returns on investment and can help define an integration/optimization calendar across all systems.

A question of our day is how to make cities smarter, following the steps and principles outlined above, in the most cost-effective and productive fashion. The answer is to focus initially on four high-impact areas of improvement (Dirks, 2010):

• Reduce congestion in transport systems – a smart system for traffic;

• Improve government services and public safety – a smart system for government services;

• Improve education systems – a smart system for education;

• Enable appropriate access to healthcare data for better quality of care, early disease detection and prevention – a smart system for health.

Quality of life and the attractiveness of a city are profoundly influenced by the core systems of a city: transport, government services and education, public safety and health (Dirks, 2010).

Research has focused on the areas - education, health, transport, public administration - identified as having high priority. For these areas, the relationship between sustainable city development and intelligent systems is highlighted.

Today's educational system is the result of remarkable progress made through the use of information and communication technologies. The societal changes caused by the transmission, storage, processing of and access to information and knowledge have put their imprint on the development of the educational system, which aims to become intelligent through intelligent systems and modern information and communication technologies. These solutions make daily activities easier and more efficient.

The existence of an intelligent educational system - one focused on the efficient use of the existing infrastructure, modernizing it where necessary - is considered crucial during an economic crisis, when funds are needed for education.

Intelligent systems represent an important part of the future educational process. They will affect the way information is received, used, understood and learned by users. If students are educated, they will know how to work for a sustainable city while keeping in view the limits of natural resources.

Intelligent transportation systems (ITS) are the best example of the harmony between sustainability and telecommunications. ITS unclogs highways by directing drivers to the least congested route, which decreases the use of natural resources for fuel and reduces the pollution emitted by idling cars (Harley, 2004). Similarly, intelligent traffic management includes passenger information displays, which allow passengers to make the most efficient decisions about their transit routes; intelligent systems for the collection of all fees related to motor vehicles; and commuter-pattern monitoring through digitized transit passes, which allows enhanced knowledge about travel activities, such as multi-stop trips. All of these technologies provide important improvements to city transportation. Through intelligent transportation systems, people also increase their own productivity: they can place phone calls and send emails from the road, and, as previously observed, telecommunications will make the automobile commute a productive part of the workday (Moss, 2006). With efficiency heightened and pollution reduced, intelligent transportation systems are a solution for a sustainable future.

The smarter approach to healthcare is one that uses information to create real insight into patient care and organizational performance. Healthcare providers, researchers and directors can work smarter by creating comprehensive, holistic views of patient data. They can get real time visibility into how their operations are running. And they can use wider ranging sample data to achieve more medical breakthroughs.

Applications of an intelligent health system include:

An intelligent system for data integration, focused on the patient, so that each person has their own information and access to a team of specialists who can work across the network. The electronic medical record eliminates paper records, reducing medical errors and improving efficiency;

An intelligent system that connects doctors, patients and insurance companies;

Electronic medical bulletins;

Electronic scheduling of medical visits and consultations.

University Hospital Motol in Prague, one of the largest health institutions in the Czech Republic, completed the first European implementation of the Grid Medical Archive Solution: a system that provides secure storage and archiving of patients' medical records for at least 10 years.

Sainte-Justine Hospital in Quebec uses automatic procedures for gathering, managing and updating critical research data, often scattered across different departments.

The Spanish public health service has implemented a regional integrated system that allows patients to go to several health centers in the region with the certainty that the doctor has access to complete and updated patient data, thus making treatment faster and more accurate.

City governments, which account for much of the employment and transactions in major urban areas, produce far less paperwork waste with the advent of e-government solutions, intranets and information phone numbers. Intelligent systems allow citizens to pay parking tickets online, process documents within the government, access public procurement tenders and government services, submit business forms online, find lost-property information, and instantly complete other processes that formerly took months.

Our society is characterized by urbanization - a large number of people now live in urban areas; technological progress - every day brings new solutions for the communication, transmission and storage of data; environmental change - every activity has an important impact on natural resources, which are in fact limited; and economic growth - the gross domestic product of our world is driven by the big cities, which bring people together and stimulate creativity and efficiency.

All the elements that characterize our society drive us to manage infrastructure and resources well and to cater to the existing and future needs of citizens.

It can take a long time for a city to become really smart. Sometimes the transformation is difficult because of the mentality of citizens; at other times the evolution can be stopped by natural disasters. But implementing the smart city reduces costs, improves efficiency and delivers the quality of life citizens expect.

The unprecedented urbanization of our cities and technological progress on the one hand, and the need for sustainable progress and economic growth on the other, demand smart solutions for water, energy, transportation, healthcare, education and safety - in other words, smart solutions for new cities.

Figure 17. Smart city advantages

The most significant advantages (Figure 17) are improved citizen transportation, access to city resources (libraries and public buildings, malls, networks, etc.) and opportunities for employment and local growth (Dirks, 2009).

All these advantages highlight the need to implement smart solutions in our country.

SPECIFIC ELEMENTS FOR A SMART CITY

In our age, the management of big data to produce knowledge and innovation is the element that will determine productivity growth.

In (Dirks, 2009) it is highlighted that every day we create 2.5 quintillion bytes of data - so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few. This data is called big data.

Knowledge and innovation are determined by investment in data management, research, education, development, creativity and transmission.

Efficient management of big data is a driver of innovation, competition and productivity.

From big data, which offers a wealth of information, we can select the useful solutions, using decisions based on our knowledge, and produce new knowledge and innovation (Figure 18).

Figure 18. Knowledge and innovation

The evolution and development of our society would in fact not exist without innovation. We can say: "It is all about knowledge and innovation".

All the solutions we have are the result of someone inventing a better way of doing something, a better tool (Franklin, 2009).

In an economic crisis, the need for knowledge and innovation is higher than ever, and the analysis of new solutions for managing data, such as open data, is very important.

Open data is the concept that certain data should be freely available to everyone to use and republish as they wish, without restrictions from copyright, patents or other mechanisms of control.

Open data offers new possibilities to analyze and visualize data from different sources.

Open data can make the world work better, and this is no exaggeration: information is a crucial driving force of innovation. The largest quantity of data in our world is generated by the public sector.

Moreover, the European Council stated in the Visby Declaration (Presidency of the European Council, 2009) that "European Union (EU) member states should seek to make data freely accessible in open machine-readable formats and stimulate the reuse of public sector information using open data".

The European Commission and the EU member states announced in the European eGovernment Action Plan 2011-2015 a commitment to "maximizing the value of re-use of public sector information (PSI), by making raw data and documents available for re-use in a wide variety of formats (including machine-readable ones) and languages and by setting up PSI portals" [38]. This highlights the necessity of using open data in every part of our cities.

Figure 19. Open data focus areas for the public sector

Open data for public sector focuses on the following areas (Figure 19):

Transparency and accountability;

Participation in the decisions;

Decisions;

Communication with enterprises and citizens;

Participation and citizens engagement;

Internal and external collaboration;

Innovation.

Intelligent processing of data is essential for addressing societal challenges. Data can for example be used to enhance the sustainability of national health care systems.

Over 30% of all data stored on earth is medical data, and it is growing rapidly; handling this data is a problem for the medical sector [40].

For the moment, open data in medicine means access only for medical professionals and patients, protected by laws, rules and regulations.

The Open Government Data Initiative (OGDI) [39] from Microsoft is a cloud-based collection of open government software assets that makes publicly available government data easily accessible.

Using open standards and application programming interfaces (API), developers and government agencies can retrieve the data programmatically for use in new and innovative online applications.

The Microsoft solution can [39]:

Encourage citizens and communities to engage with governments;

Enhance collaboration between government agencies and private organizations;

Increase government transparency;

Provide unique insight into data trends and analysis.

OGDI promotes the use of this data by capturing and publishing re-usable software assets, patterns, and practices. OGDI data is hosted in Windows Azure. It is accessible through open, standards-based web services from a variety of development environments, including Microsoft .NET, JavaScript, Adobe Flash, PHP, Ruby, Python, and others.

IBM's initiative in this direction is based on Real Web 2.0 Linking Open Data (LOD) [2], a community initiative for moving the Web from the idea of separate documents to a wide information space of data.

The key principles of LOD are that it is simple, readily adaptable by Web developers, and complements many other popular Web trends.

LOD aims to make data more widely used by making its components easier to discover, more valuable, and easier for people to reuse - in ways one might not anticipate [36].

A look back at recent history shows that data storage capacity is continuously increasing, while the cost per GB stored is decreasing (Figure 1). The first hard disk came from IBM in 1956; it was called the IBM 350 Disk Storage and had a capacity of 5 MB (IBM, 2010). In 1980, the IBM 3380 broke the gigabyte-capacity limit, providing storage for 2.52 GB. Twenty-seven years later, Hitachi GST, which acquired IBM's drive division in 2003, delivered the first terabyte hard drive. Only two years later, in 2009, Western Digital launched the industry's first two-terabyte hard drive, and in 2011 Seagate introduced the world's first 4 TB hard drive [11].

Figure 1. Average HDD capacity (based on (Smith, 2008))

In terms of price (Fern, 2011), the cost per gigabyte decreased over the last 30 years from an average of $300,000 to a mere average of 11 cents (Figure 2). In fact, in 1981 one would have needed 200 Seagate units, each with a five-megabyte capacity and costing $1,700, to store one gigabyte of data.

Figure 2. Average cost per GB in $ (based on (Smith, 2008))

The reduced cost of data storage is the main premise of the current data age, in which it is possible to record and store almost everything, from business to personal data. As an example, in the field of digital photography, it is estimated that over 3.5 trillion photos have been taken around the world, and in 2011 alone around 360 billion new snapshots were made (Good, 2011). Also, the availability and speed of Internet connections around the world have increased the data traffic generated by a wide range of mobile devices and desktop computers. For mobile data alone, Cisco expects traffic to grow from 0.6 exabytes (EB) in 2011 to 6.3 EB in 2015 (Cisco, 2010). The expansion of data communications has promoted different data services - social, economic or scientific - to central nodes for storing and distributing large amounts of data. For example, the Facebook social network hosts more than 140 billion photos, more than double the 60 billion pictures at the end of 2010; in terms of storage, all these snapshots take up more than 14 petabytes. In other fields, such as research, the Large Hadron Collider particle accelerator near Geneva will produce about 15 petabytes of data per year, and the SETI project records around 30 terabytes of data each month, processed by over 250,000 computers each day (SETI project, 2010). The supercomputer of the German Climate Computing Center (DKRZ) has a storage capacity of 60 petabytes of climate data. In the financial sector, records of everyday financial operations generate huge amounts of data: the New York Stock Exchange alone records about one terabyte of trade data per day (Rajaraman, 2008).

Despite this spectacular evolution of storage capacities and deposit sizes, the problem that arises is being able to process the data. This issue is driven by the available computing power, algorithm complexity and access speeds. This chapter surveys different technologies used to manage and process large data volumes and proposes a distributed, parallel architecture used to acquire, store and process large datasets. The objective of the proposed architecture is to implement a cluster analysis model.

As professor Anand Rajaraman observed, more data usually beats a better algorithm [15]. The remark was used to highlight the efficiency of a proposed algorithm for the Netflix Challenge (Bennett, 2007). Although the statement is still debatable, it brings up a true point: a given data mining algorithm yields better results with more data, and it can reach the same accuracy as a better or more complex algorithm. In the end, the objective of a data analysis and mining system is to process more data with better algorithms (Rajaraman, 2008).

In many fields, more data is important because it provides a more accurate description of the analyzed phenomenon. With more data, data mining algorithms are able to extract a wider group of influence factors and more subtle influences.

Today, large datasets mean volumes of hundreds of terabytes or petabytes, and these are real scenarios. The problem of storing such large datasets is generated by the impossibility of having a single drive of that size and, more importantly, by the large amount of time required to access it.

Access speed for large data is affected by disk performance characteristics [22] (Table 1) - internal data transfer, external data transfer, cache memory, access time and rotational latency - which generate delays and bottlenecks.

Table 1. Example of disk drive performance (source: Seagate)

HDD interface  | Spindle [rpm]   | Avg. rotational latency [ms] | Internal transfer [Mbps] | External transfer [MBps] | Cache [MB]
SATA           | 7,200           | 11                           | 1030                     | 300                      | 8 – 32
SCSI           | 10,000          | 4.7 – 5.3                    | 944                      | 320                      | 8
High-end SCSI  | 15,000          | 3.6 – 4.0                    | 1142                     | 320                      | 8 – 16
SAS            | 10,000 / 15,000 | 2.9 – 4.4                    | 1142                     | 300                      | 16

Despite the rapid evolution of drive capacity described in Figure 1, large datasets of up to one petabyte can only be stored on multiple disks: using 4 TB drives, 250 of them are required to store 1 PB of data. Once the storage problem is solved, another question arises: how easy or hard is it to read that data? Considering an optimal transfer rate of 300 MB/s, the entire dataset is read in about 38.5 days. A simple solution is to read from all the disks at once; in this way the entire dataset is read in only 3.7 hours.
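The arithmetic above can be checked with a short back-of-the-envelope script. Decimal units (1 PB = 10^15 bytes), as used by drive vendors, are assumed:

```python
# Back-of-the-envelope check of the read times quoted above:
# 1 PB read sequentially at 300 MB/s vs. in parallel from 250 drives.

PB = 10**15          # bytes in one petabyte (decimal units)
RATE = 300 * 10**6   # optimal transfer rate: 300 MB/s, in bytes per second
DRIVES = 250         # 250 x 4 TB drives hold 1 PB

sequential_s = PB / RATE             # one drive read end to end
parallel_s = sequential_s / DRIVES   # all drives streaming at once

print(f"sequential: {sequential_s / 86400:.1f} days")  # ~38.6 days
print(f"parallel:   {parallel_s / 3600:.1f} hours")    # ~3.7 hours
```

The exact sequential figure comes out at roughly 38.6 days; the text's 38.5 days is the same calculation rounded down.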

If we take the communication channel into consideration, further bottlenecks are generated by the available bandwidth. In the end, the performance of the solution is reduced to the speed of the slowest component.

Other characteristics of large data sets add supplementary levels of difficulty:

many input sources; in different economic and social fields there are multiple sources of information;

redundancy, as the same data can be provided by different sources;

lack of normalization or data representation standards; data can have different formats, unique IDs, measurement units;

different degrees of integrity and consistency; data that describes the same phenomenon can vary in terms of measured characteristics, measuring units, time of the record, methods used.
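The lack-of-normalization problem listed above can be sketched with a hypothetical example (all field names, record layouts and values here are invented for illustration): two sources describe the same phenomenon with different field names, date formats and measurement units, and must be mapped onto one common schema before analysis.

```python
# Hypothetical sketch: mapping two heterogeneous sources to one schema.
from datetime import datetime

def normalize_source_a(rec):
    # source A (assumed): ISO dates, temperatures already in Celsius
    return {"id": rec["sensor_id"],
            "time": datetime.strptime(rec["date"], "%Y-%m-%d"),
            "temp_c": rec["temp_c"]}

def normalize_source_b(rec):
    # source B (assumed): US-style dates, temperatures in Fahrenheit
    return {"id": rec["station"],
            "time": datetime.strptime(rec["day"], "%m/%d/%Y"),
            "temp_c": round((rec["temp_f"] - 32) * 5 / 9, 1)}

a = normalize_source_a({"sensor_id": "s1", "date": "2012-03-01", "temp_c": 10.0})
b = normalize_source_b({"station": "s2", "day": "03/01/2012", "temp_f": 50.0})
print(a["temp_c"], b["temp_c"])  # both records now comparable: 10.0 10.0
```

After normalization the two records share one schema, so redundancy detection and aggregation become straightforward dictionary operations.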

For limited datasets, relational SQL databases provide an efficient data management solution (Pavlo, 2009), but for large datasets some of their founding principles are eroded (Helland, 2011), (Pavlo, 2009), (Tamer, 1996).

The Extract, Transform and Load (ETL) process provides an intermediary transformations layer between outside sources and the end target database.

The literature [29], [30] identifies both ETL and ELT. ETL (extract, transform and load) starts by using applications to perform data transformations outside the database, on a row-by-row basis; ELT (extract, load and transform), on the other hand, first loads the source data into the relational database and only then transforms it into the target data.

The ETL process (Davenport, 2009) is based on three elements:

Extract – the data is read from multiple source systems and brought into a single format;

Transform – the source data is transformed into a format relevant to the solution; this step makes the data from the various systems consistent;

Load – the transformed data is written into the warehouse.
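The three steps above can be sketched in a few lines; this is a minimal illustration using Python's built-in SQLite as a stand-in target warehouse, with invented source data and table layout, not a real ETL tool:

```python
# Minimal ETL sketch: extract from two heterogeneous sources,
# transform to a consistent format, load into a warehouse table.
import sqlite3

# Extract: read rows from multiple source "systems" into a single format.
source_a = [("2012-01-03", "100.50"), ("2012-01-04", "98.20")]   # CSV-style
source_b = [{"day": "2012-01-05", "amount": 101.7}]              # record-style
extracted = list(source_a) + [(r["day"], r["amount"]) for r in source_b]

# Transform: make the data consistent (all amounts become floats).
transformed = [(day, float(amount)) for day, amount in extracted]

# Load: write the transformed rows into the warehouse.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trades (day TEXT, amount REAL)")
db.executemany("INSERT INTO trades VALUES (?, ?)", transformed)

total = db.execute("SELECT SUM(amount) FROM trades").fetchone()[0]
print(round(total, 2))  # 300.4
```

Once loaded, the warehouse answers analysis queries (here a simple SUM) without touching the raw sources again, which is the point of the Load step.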

Usually, the systems that acquire data are optimized to store it as fast as possible. Most of the time, comprehensive analyses require access to multiple sources of data, and it is common for those sources to store raw data that yields minimal information unless properly processed.

This is where the ETL or ELT processes come into play. An ETL process takes the data stored in multiple sources, transforms it so that the metrics and KPIs are readily accessible, and loads it into an environment modeled so that analysis queries are more efficient (Vassiliadis, 2009).

The ETL advantages are [47], (Dumey, 2007), (Albrecht, 2009):

it saves time and costs when developing and maintaining data migration tasks;

it can be used for complex processes that extract, transform and load heterogeneous data into a data warehouse, or for other data migration tasks;

in larger organizations, ETL processes for different data integration and warehouse projects accumulate;

such processes encompass common sub-processes, shared data sources and targets, and the same or similar operations;

ETL tools support all common databases, file formats and data transformations, simplify the reuse of already created (sub-)processes through a collaborative development platform, and provide central scheduling.

When developing the ETL process there are two options: take advantage of an existing ETL tool (some key players in this domain are IBM DataStage, Ab Initio and Informatica) or custom code it. Both approaches have benefits and pitfalls that need to be carefully weighed against the specific environment.

MapReduce is a programming model and an associated implementation for processing and generating large data sets (Dean, 2008). The model was developed by Jeffrey Dean and Sanjay Ghemawat at Google. Its foundations are a map function used to process key-value pairs and a reduce function that merges all intermediate values of the same key.

The large data set is split in smaller subsets which are processed in parallel by a large cluster of commodity machines.

The Map function (Dean, 2004) takes input data and produces a set of intermediate subsets. The MapReduce library groups together all intermediate subsets associated with the same intermediate key and sends them to the Reduce function.

The Reduce function also accepts an intermediate key and its subsets. It merges these subsets to form a possibly smaller set of values; normally, zero or one output value is produced per Reduce invocation.
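The classic word-count example illustrates the model; the following is a minimal in-memory sketch of the map, shuffle and reduce phases, not the Google implementation:

```python
from collections import defaultdict

def map_fn(document):
    # Map: emit an intermediate (word, 1) pair for every word
    for word in document.split():
        yield word, 1

def shuffle(pairs):
    # the library's grouping step: collect all intermediate values
    # associated with the same intermediate key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_fn(key, values):
    # Reduce: merge the values of one key into a single output value
    return key, sum(values)

documents = ["smart city data", "city data platform"]
pairs = [p for doc in documents for p in map_fn(doc)]
result = dict(reduce_fn(k, v) for k, v in shuffle(pairs).items())
print(result)   # {'smart': 1, 'city': 2, 'data': 2, 'platform': 1}
```

In a real cluster the documents would be split across machines and the map and reduce calls would run in parallel, but the data flow is exactly this.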

In [26] it is highlighted that many real-world tasks use the MapReduce model: web search services, data sorting and processing, data mining, machine learning and a large number of other systems.

A general MapReduce architecture can be illustrated as in Figure 22.

Figure MapReduce architecture (based on (Yu, 2008))

The entire framework manages how data is split among nodes and how intermediary query results are aggregated.

The MapReduce advantages are (White, 2009), (Dean, 2010), (Carroll, 2008), (Dean, 2008), (Pavlo, 2008), (Stonebreaker, 2010), (Dean, 2004):

the model is easy to use, even for programmers without experience with parallel and distributed systems;

storage-system independence, as it does not require proprietary database file systems or predefined data models; data is stored in plain text files and is not required to respect relational schemes or any structure; in fact, the architecture can use data in an arbitrary format;

fault tolerance;

the framework is available from high level programming languages; one such solution is the open-source Apache Hadoop project which is implemented in Java;

the query language allows record-level manipulation;

projects such as Pig and Hive (Yu, 2008) provide a rich interface that allows programmers to join datasets without repeating simple MapReduce code fragments.

Hadoop is a distributed computing platform and an open source implementation of the MapReduce framework proposed by Google (Yu, 2008). It is based on Java and uses the Hadoop Distributed File System (HDFS), the primary storage system for Hadoop applications. HDFS creates multiple replicas of data blocks for reliability, distributes them around the clusters and splits the task into small blocks. The relationship between Hadoop, HBase and HDFS can be illustrated as in Figure 23.

Figure The relationship between Hadoop, HBase and HDFS (based on (Yu, 2008))

HBase is a distributed, versioned, column-oriented open source database, modeled after Google's Bigtable [46].

Some of the features of HBase, as listed at [46], are:

convenient base classes for backing Hadoop MapReduce jobs with HBase tables including cascading, hive and pig source and sink modules;

query predicate push down via server side scan and get filters;

optimizations for real time queries;

a Thrift gateway and a REST-ful Web service that supports XML, Protobuf, and binary data encoding options;

extensible JRuby-based (JIRB) shell;

support for exporting metrics via the Hadoop metrics subsystem to files or Ganglia, or via JMX.

The HBase database stores data in labeled tables. In this context (Yu, 2008), the table is designed to have a sparse structure: data is stored in table rows, and each row has a unique key and an arbitrary number of columns.

A distributed database (DDB) is a collection of multiple, logically interconnected databases distributed over a computer network. A distributed database management system (distributed DBMS) is the software system that permits the management of the distributed database and makes the distribution transparent to the users. A parallel DBMS is a DBMS implemented on a multiprocessor computer (Özsu, 1996). Based on the above definitions, we can conclude that parallel database systems improve performance of data processing by parallelizing loading, indexing and querying data. In distributed database systems, data is stored in different DBMSs that can function independently. Because parallel database systems may distribute data to increase the architecture performance, there is a fine line that separates the two concepts in real implementations.

Despite the differences between parallel and distributed DBMSs, most of their advantages are common to a simple DBMS, (Pavlo, 2009):

stored data conforms to a well-defined schema; this validates the data and provides data integrity;

data is structured in a relational paradigm of rows and columns;

SQL queries are fast;

the SQL query language is flexible, easy to learn and read and allows programmers to implement complex operations with ease;

use hash or B-tree indexes to speed up access to data;

can efficiently process datasets up to two petabytes of data.

Known commercial parallel databases such as Teradata, Aster Data, Netezza (Henschen, 2008), DATAllegro, Vertica, Greenplum, IBM DB2 and Oracle Exadata have proven successful because they:

allow linear scale-up and linear speed-up, (Özsu, 1996);

implement inter-query, intra-query and intra-operation parallelism, (Özsu, 1996);

reduced implementation effort;

reduced administration effort;

high availability.

In a massively parallel processing (MPP) architecture, adding more hardware allows for more storage capacity and increased query speeds. An MPP architecture, implemented as a data warehouse appliance, reduces the implementation effort, as the hardware and software are preinstalled and tested to work on the appliance prior to the acquisition. It also reduces the administration effort, as it comes as a single-vendor, out-of-the-box solution. The data warehouse appliances offer high availability through built-in fail-over capabilities using data redundancy for each disk.

Ideally, each processing unit of the data warehouse appliance should process the same amount of data at any given time. To achieve that, the data should be distributed uniformly across the processing units. Data skew is a measure that evaluates how data is distributed across the processing units; a data skew of 0 means that the same number of records is placed on each processing unit, which is the ideal case.

By having each processing unit do the same amount of work, all processing units finish their tasks at about the same time, minimizing waiting times.
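Data skew can be quantified in several ways; a minimal sketch, here taken as the maximum relative deviation of a unit's record count from the mean (an illustrative choice, not a standard definition):

```python
def data_skew(records_per_unit):
    """Relative deviation of the most loaded/underloaded unit from the
    mean record count; 0 means a perfectly uniform distribution."""
    mean = sum(records_per_unit) / len(records_per_unit)
    return max(abs(n - mean) / mean for n in records_per_unit)

print(data_skew([1000, 1000, 1000, 1000]))  # 0.0 - the ideal case
print(data_skew([400, 600, 500, 500]))      # 0.2 - one unit 20% off the mean
```

A monitoring job could compute this after each load and trigger a redistribution when the skew exceeds a threshold.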

Another aspect with an important impact on query performance is having all the related data on the same processing unit, which eliminates the time required to transfer data between processing units. For example, if the user requires a sales-by-country report, having both the sales data for a customer and his geographic information on the same processing unit ensures that each processing unit has all the information it needs and can perform its tasks independently.
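One common way to achieve this co-location is to hash-distribute the related tables on the same key; a sketch (the table and column names are hypothetical):

```python
from zlib import crc32

UNITS = 4  # number of processing units (assumed)

def unit_for(distribution_key):
    """Deterministic hash: the same key maps to the same processing
    unit in every table, so related rows always land together."""
    return crc32(distribution_key.encode()) % UNITS

customers = [{"id": "C1", "country": "RO"}, {"id": "C2", "country": "DE"}]
sales = [{"customer": "C1", "amount": 10}, {"customer": "C2", "amount": 20}]

# distribute both tables on the customer key
placed_customers = {c["id"]: unit_for(c["id"]) for c in customers}
placed_sales = {s["customer"]: unit_for(s["customer"]) for s in sales}

# a customer's row and his sales rows share a unit, so joins are local
assert placed_customers == placed_sales
```

Choosing the distribution key is the design decision here: a key used in the most frequent joins keeps those joins local, while a low-cardinality key (e.g. country) would concentrate data on few units and increase skew.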

The proposed architecture is used to process large financial datasets. The results of the data mining analysis help economists to identify patterns in economic clusters that validate existing economic models or help to define new ones. The bottom-up approach is more efficient but more difficult to implement, because it can suggest, based on real economic data, relations between economic factors that are specific to the cluster model. The difficulty comes from the large volume of economic data that needs to be analyzed.

The architecture, described in Figure 24, has three layers:

the input layer implements data acquisition processes; it gets data from different sources, reports, data repositories and archives managed by governmental and public structures, economic agencies and institutions, and NGO projects; some global sources of statistical economic data are Eurostat, the International Monetary Fund and the World Bank; the problem with these sources is that they use independent data schemes, and bringing them to a common format is an intensive data processing stage; if national data sources or data crawled from the Web are also taken into consideration, the task becomes a very complex one;

the data layer stores and processes large datasets of economic and financial records; this layer implements distributed, parallel processing;

the user layer provides access to data and manages requests for analysis and reports.

Figure Proposed architecture (Boja et al, 2012)

The ETL intermediary layer, placed between the first two main layers, collects data from the data crawler and harvester component, converts it into a new form and loads it into the parallel DBMS data store. The ETL normalizes the data, transforms it based on a predefined structure and discards unneeded or inconsistent information.

The ETL layer inserts data into the parallel distributed DBMS that implements the Hadoop and MapReduce framework. The objective of the layer is to normalize data and bring it to the common format requested by the parallel DBMS.

Using an ETL process, data collected by the data crawler and harvester gets consolidated, transformed and loaded into the parallel DBMS, using a data model optimized for data retrieval and analysis.

It is important that the ETL server also supports parallel processing, allowing it to transform large data sets in a timely manner. ETL tools like Ab Initio, DataStage and Informatica have this capability built in. If the ETL server does not support parallel processing, then it should only define the transformations and push the processing to the target parallel DBMS.

The user will submit his inquiries through a front end application server, which will convert them into queries and submit them to the parallel DBMS for processing.

The front end application server will include a user-friendly metadata layer that allows the user to query the data ad-hoc; it will also include canned reports and dashboards.

The objective of the proposed architecture is to separate the layers that are data processing intensive and to link them by ETL services that will act as data buffers or cache zones. Also the ETL will support the transformation effort.

For the end user the architecture is completely transparent. All he will experience is the look and feel of the front end application.

Processing large datasets obtained from multiple sources is a daunting task, as it requires tremendous storage and processing capacities. Moreover, processing and analyzing large volumes of data becomes unfeasible with a traditional serial approach, while distributing the data across multiple processing units and processing it in parallel yields linearly improved processing speeds.

When distributing the data, it is critical that each processing unit is allocated the same number of records and that all related data sets reside on the same processing unit.

Using a multi-layer architecture to acquire, transform, load and analyze the data ensures that each layer can use the best-of-breed solution for its specific task. For the end user the experience is transparent: despite the number of layers behind the scenes, all that is exposed to him is a user-friendly interface supplied by the front end application server.

In the end, once the storing and processing issues are solved, the real problem is to search for relationships between different types of data (Helland, 2011). Others have done this very successfully, like Google in Web searching or Amazon in e-commerce.

A Quality Management Framework in the context of actual services has the following components: the on-line service as object (entity), the on-line service development and delivery process as process, business and consumers as users, specific service requests as requirements, and the evaluation and measurement of the e-service to determine its quality.

On-line service quality management process uses surveys and questionnaires to evaluate the following qualitative aspects of the e-service:

Awareness – the degree to which users are aware of the on-line service's existence and its features.

Expectations – what users think that the on-line service offers.

Accessibility – the degree to which all individuals can access the service regardless of education, age, sex, culture, religion or the existence of any physical handicap.

Driving reasons for use – what made the user access the on-line service instead of using the traditional method.

Use preventing reasons – what prevents the user from using the service.

Feedback on additional features needed – what users are requesting in order to enhance their experience while using the on-line service.

User impact – how the on-line service changes the user's routine.

Overall satisfaction – how satisfied the user is with on-line service, overall.

Qualitative characteristics of the on-line service help to construct an image of the current level of quality. They do have the disadvantage, though, that they cannot be used in calculations, are difficult to compare or aggregate, cannot be used in trend analysis, and targets cannot be set for them.

The quantitative metrics of the quality management framework eliminate the disadvantages of qualitative evaluation. The following are metrics that can be included in the QMF for on-line services:

Accuracy represents the percentage of times the on-line service has provided accurate results to users' requests.

The degree of satisfaction (Pocatilu, 2007: pp.122-125) can be computed as:

DS = (DSR1 + DSR2 + … + DSRp) / TR

where:

DSRi – the degree of satisfaction for the requirement i

TR – total number of requirements

p – the number of requirements

The degree of satisfaction for a requirement is a value from 0 (no satisfaction) to 1 (fully satisfied).
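As a sketch, the degree-of-satisfaction metric above can be computed as the sum of the per-requirement values DSRi divided by the total number of requirements TR; it is assumed here that every requirement is evaluated (p = TR), and the sample values are made up:

```python
def degree_of_satisfaction(dsr):
    """dsr: per-requirement satisfaction values DSRi, each in [0, 1].
    Returns the overall DS in [0, 1], assuming all TR requirements
    are evaluated (p = TR)."""
    tr = len(dsr)              # TR - total number of requirements
    return sum(dsr) / tr

print(degree_of_satisfaction([1.0, 0.5, 0.75, 0.75]))  # 0.75
```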

Repeat consumers represent the percentage of users that have used the same on-line service more than once.

Awareness represents the percentage of targeted users that are aware of the e-service existence and its features.

Cost represents the fee that has to be paid to access the service. It can be expressed as a per-use cost or a per-membership cost. Per-use cost implies that the user pays a fee every time he accesses the service, whereas per-membership cost implies that the user pays a fee once per period, usually in advance, and gets access to the e-service for that period.

In (Pocatilu, 2007: pp.122-125) the cost of resources takes into account the category of resources and the cost per unit for each category:

CR = NR1 × p1 × d1 + NR2 × p2 × d2 + … + NRn × pn × dn

where:

NRi – number of resources from the category i

pi – price per unit for the resource category i

di – units of usage for the resource category i

The total cost of the on-line service can be defined as:

CT = c1 + c2 + … + ck

where:

k – the number of project phases

ci – the cost of all resources from the phase i
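Both cost formulas can be sketched together; the per-phase resource cost is the sum of NRi × pi × di over the resource categories, and the total cost is the sum of the phase costs (the resource figures below are made up):

```python
def phase_cost(resources):
    """resources: list of (NRi, pi, di) tuples, one per resource
    category i - count, price per unit, units of usage."""
    return sum(nr * price * units for nr, price, units in resources)

def total_cost(phases):
    """CT = c1 + c2 + ... + ck over the k project phases."""
    return sum(phase_cost(p) for p in phases)

phases = [
    [(2, 10.0, 5), (1, 20.0, 3)],   # phase 1: 2*10*5 + 1*20*3 = 160
    [(1, 15.0, 4)],                 # phase 2: 1*15*4 = 60
]
print(total_cost(phases))  # 220.0
```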

Request satisfaction based on time represents the time consumed to access the on-line service. Depending on the e-service nature, it can be expressed in seconds, minutes, hours, days, months and even years:

RST = (O1 + O2 + … + On) / T

where:

T – period of time

Oi – the output i (deliverables, results)

n – the number of outputs

At a national level, the following indexes are used for the comparative assessment of states' ability to deliver on-line services and products to their citizens.

Web measure index is based on a five stage model (emerging, enhanced, interactive, transactional, and connected) and ranks countries based on their position through the various stages.

Telecommunication Infrastructure Index is a composite index of five primary indices, each weighting 20% in the total value of the index:

Internet Users /100 persons

PCs /100 persons

Main Telephones Lines /100 persons

Cellular telephones /100 persons

Broadband /100 persons

Human Capital Index is a composite index of the adult literacy rate and gross enrollment ratio. Adult literacy rate is weighted 67% and gross enrollment ratio is weighted 33%.

Readiness index is a composite index comprising the web measure index, the telecommunication infrastructure index and the human capital index.
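The composite indices above can be sketched as follows; the input values are made up for illustration, and the equal weighting of the three readiness components is an assumption, not a figure from the survey:

```python
def telecom_index(internet, pcs, phone_lines, cellular, broadband):
    """Telecommunication Infrastructure Index: five primary indices,
    each weighted 20% (inputs normalized to [0, 1])."""
    return 0.2 * (internet + pcs + phone_lines + cellular + broadband)

def human_capital_index(adult_literacy, gross_enrollment):
    """Human Capital Index: adult literacy 67%, gross enrollment 33%."""
    return 0.67 * adult_literacy + 0.33 * gross_enrollment

def readiness_index(web_measure, telecom, human_capital):
    """Readiness index combining the three components;
    equal weights are assumed here."""
    return (web_measure + telecom + human_capital) / 3

tii = telecom_index(0.3, 0.2, 0.25, 0.6, 0.1)   # illustrative values
hci = human_capital_index(0.97, 0.8)
print(round(readiness_index(0.5, tii, hci), 4))  # ~0.568
```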

Table E-Government Readiness for Eastern Europe

Country          Indicator 2008   Indicator 2010   Position 2008   Position 2010
Czech Republic       0.6696           0.6060             25              33
Hungary              0.6494           0.6315             30              27
Poland               0.6134           0.5582             33              45
Slovakia             0.5889           0.5639             38              43
Ukraine              0.5728           0.5181             41              54
Bulgaria             0.5719           0.5590             43              44
Romania              0.5383           0.5479             51              47
Belarus              0.5213           0.4900             56              65
Russia               0.5120           0.5136             60              59
Moldavia             0.4510           0.4611             93              80

As we can see in Table 1, Romania ranks 6th in Eastern Europe and 47th in the world in the United Nations e-Government Survey 2010 report.

LOCAL DEVELOPMENT THROUGH SMART SOLUTIONS IN EUROPEAN UNION

"Cities generate a lot of useful data," says Tuomo Haukkovaara, General Manager of IBM Finland (IBM, 2012), and every city must work actively to make such data open.

Open data can make the world a better place, and this is no exaggeration: information is a crucial driving force of innovation.

Information is a unique kind of resource. It is a so-called "public good": consumption of information by one individual does not reduce the availability of the information for others. This is why the benefit of information can extend far beyond its initial purpose.

Governments, as a major producer of information, are therefore in a strong position to invest in innovation by promoting open government data [37].

Open data is now seen as a fundamental component of open government – a broader strategy that looks at the evolving roles of governments and citizens in delivering public services to society (Huijboom, 2011).

Open government focuses (Figure 25) on the following areas [35]:

Transparency and accountability;

Participation and citizen engagement;

Internal and external collaboration;

Innovation.

Figure Open Government

The expectations are that by adopting open government principles, governments – in collaboration with business, non-profit organizations and engaged citizens – can deliver more services with higher quality and improved democracy.

Some of the most important benefits of open data solutions are [36]:

Delivering trusted information through integration: For example, master data management can enable a single version of the truth across all information to be maintained without a rip and replace strategy.

Information security and information governance: It is important to specify and enforce the ways in which information is created, stored, used, archived and deleted. This can include defining processes, roles, standards and metrics.

Data Quality: It is important to ensure clean, standardized, and non-duplicated information. It could be useful to indicate the degree to which data has been cleaned, validated, etc., to help bolster user confidence.

Real time connectivity: Providing access to diverse and distributed information in real time may involve supporting federated queries, single sign-on, unified views, and bi-directional data access services.

Identity and relationships resolution: As an example, managing global names for identity enablement may be required.

We can say that open data is a young movement. The first steps in this direction were made by the U.S. Government, with the www.data.gov site launched in mid-2009, and by the U.K. Government, with data.gov.uk in 2010.

These two sites are well known and are the most advanced open data sites in existence today.

The open data movement is spreading rapidly and today many countries, states, regions and cities are taking their first steps with open data sites.

In May 2010, the Australian government published a Declaration of Open Government (AGIMO, 2010), (Huijboom, 2011) in which it supported informing and engaging citizens through increased government transparency.

In other Western countries open data has increasingly been placed on the agenda by politicians and policy makers.

The Danish government launched an Open Data Innovation Strategy in July 2010 (Danish Ministry of Science, Technology and Innovation, 2010) and several regions in Spain have actively developed open data policies (the Basque Country, Catalonia and Aragon) (Huijboom, 2011).

In 2010 the Spanish Ministry of Industry, Tourism and Trade launched an Open Data Innovation Strategy ('Avanza2'). The project highlights that data are crucial for the knowledge economy: by publishing public sector data, more (economic) value can be generated (Huijboom, 2011).

Data are the most important source for the development of new products and services. In addition, data are important for exercising one's democratic rights: citizens become better informed about and engaged in government.

Implementing an open data strategy is not a simple task.

In addition, when investing in the technology and operations of open data sites, it is essential to keep their barriers in view. We can identify the usual barriers [3] to successful implementation, such as:

cultures opposed to openness;

data quality problems;

and difficulties in developing appropriate models for charging for open data.

The instruments applied by countries to implement open data policy can be divided into four types:

education and training (Knowledge exchange platforms, conference, sessions, workshops),

voluntary approaches (Overall strategies and programmes, General recommendations, Public voluntary schemes),

economic instruments (Competitions, app contests and camps, Financing of open data portals);

legislation and control (Public sector information law, Technical standards, Monitoring).

The IBM team was asked to help the city of Helsinki develop strategies for [35]:

creating visualizations that enable citizens to make use of and benefit from open data;

defining the components necessary to grow a sustainable, repeatable platform, process and ecosystem that leverage the principles of open data, turning data into information, information into action, and action into change [35].

The expectations are that by adopting open data principles, governments – in collaboration with business, non-profit organizations and engaged citizens – can deliver more services with higher quality and improved society [35], [36].

All over the world we can find many cities that have implemented different smart solutions to become smarter.

Dubuque, Iowa is an example of a smart city. It uses ICT to optimize resources and operations for its citizens, with three major themes in view: economic prosperity, quality of life and environmental integrity. The model (Kim, 2009) was based on 11 core principles, including the smart use of energy, water and other resources, like green buildings and great community knowledge. Dubuque has been a pioneer in putting citizens at the center of the smarter city transformation and in encouraging economic growth.

Seoul, the South Korean capital, is among the first cities to offer a solution for universal connectivity (Washburn, 2009). Among the many benefits of this solution are access to administrative information and the transparency of services. Seoul is also developing a project for individual apartments, with panels in each room that control lighting, temperature and access to media. The effect of this new solution is immediate and can be seen in the consumption of water and energy.

In Bari, Italy, using a touch screen installed on fishing boats, local fishermen can immediately determine where they can sell their fish.

Directly from the boats, using simple touch screen systems that access a cloud computing solution, fishermen enter the type of fish caught and instantaneously start a virtual auction with wholesalers on the docks [44].

The University of Bari has also developed a smart system focused on wine production. Winemakers at up to 60 cooperative wineries in the South of Italy are able to determine market demand for various types of wines by accessing the cloud computing solution [44].

A smart traffic system helped the city of Stockholm, Sweden (Palmisano, 2008) reduce traffic by 20%, reduce emissions by 12% and increase the use of public transportation. Stockholm has implemented a system that automatically charges drivers a fee based on how much they drive (using control points and cameras) in order to reduce congestion and greenhouse gas emissions.

All over the world we can find many examples of implemented smart solutions, so it is essential to derive a model for a smart city.

A question of our day is how to make cities smarter, following the steps and principles outlined above in the most cost-effective and productive fashion. The answer is to focus initially on four high-impact areas of improvement (Dirks, 2010):

Reduce congestion in transport systems – a smart system for traffic;

Improve government services and public safety – a smart system for government services;

Improve education systems – a smart system for education;

Enable appropriate access to healthcare data for better quality of care – a smart system for health.

All the research highlights that we can implement smart solutions in our cities to make them smarter. For this we can start from the four areas (transportation, education, healthcare and public administration) that form the core systems of the city.

For a smart city we must aim to interconnect all the city systems. This can be done step by step, starting from one system and then extending to the entire city.

Every system can be characterized by some functions: sensing, information management, analytics and modeling, and influencing outcomes. For best results these systems must be integrated into one system.

Analyzing the models of smart cities that exist around the world, I identified some priority categories: a model for resource consumption, a model for traffic congestion, a model for healthcare and a model for the educational system. All the models aim to optimize control of the city systems and to offer citizens better outcomes and a better quality of life.

The first step towards a smart city is observing and understanding city activity; in this way we will know where to implement the smart solutions. For a city with a big traffic problem we consider it a good idea to start with a model for traffic congestion, while in a city with high water consumption we would start with a smart solution for reducing it. For this we must analyze all the important areas of the city.

The second step is to select the primordial area of the city and implement the smart solution there.

The third step is to extend the smart solutions to other areas of the city and to manage information across all the city systems.

And the last step is to optimize the solutions in order to improve the quality of life.

LOCAL DEVELOPMENT THROUGH SMART SOLUTIONS IN ROMANIA

For our country we must point out that the smart city is a strategy, not yet a reality. In Romania, judging by the steps identified above and according to Eurostat data regarding the use of modern technologies, most cities are still found in the first step of the smart evolution, the one in which the telecommunications network infrastructure is being improved.

To analyze the stage of our country it is necessary to investigate the implications of smart solutions for sustainable city development and to gauge the readiness of employees for smart solutions. These solutions concentrate on the core areas of the city: administration, education, health, transportation, etc. With this purpose in view, the framework for a case study is built up employing quantitative and qualitative research for a mid-sized Romanian city. Exploratory research techniques combined with a survey methodology have been used for studying the preparation of employees for the smart solutions. A set of derived procedures has been employed for collecting and analyzing more than 400 observations from a heterogeneous population. They have been correlated with indicators able to characterize sustainable city development, so as to point out the impact of the smart solutions and the possibilities to use their facilities in this respect. The results show that smart solutions are highly recommended for future sustainable development and are almost ready to penetrate the city from a technological perspective, but the information and understanding of citizens with regard to this new way of evolution are still lacking.

The survey concentrated on the core areas of the city: administration, education, health, transportation, etc. It was performed among the employees of a mid-sized Romanian city, Râmnicu Vâlcea, in order to investigate three main issues:

the use of the present communication infrastructure: how employees perceive the use of online solutions in their activities;

how the existing communication infrastructure is used in employees' activities;

whether the employees consider the relationships established between companies through smart solutions beneficial.

The survey consisted of 18 questions targeting the following categories:

general information about the employees: age, gender;

questions regarding how existing communication infrastructure is used for their activities;

questions regarding how online communication changes employees' activities.


