By Kamal Garg CSE
Meaning
Cloud computing refers to the delivery of computing and storage capacity as a service to a heterogeneous community of end-recipients. The name comes from the use of a cloud as an abstraction for the complex infrastructure it contains in system diagrams. Cloud computing entrusts services with a user's data, software and computation over a network. It has considerable overlap with software as a service (SaaS).
End users access cloud-based applications through a web browser or a lightweight desktop or mobile app, while the business software and data are stored on servers at a remote location. Proponents claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to adjust resources more rapidly to meet fluctuating and unpredictable business demand. Cloud computing relies on sharing of resources to achieve coherence and economies of scale, similar to a utility (like the electricity grid) over a network (typically the Internet). At the foundation of cloud computing is the broader concept of converged infrastructure and shared services.
History
The
term cloud is used as a metaphor for the Internet, based on the cloud drawing
used in the past to represent the telephone network, and later to depict the
Internet in computer network diagrams as an abstraction of the underlying
infrastructure it represents. In the 1990s, telecommunications companies, which had previously offered primarily dedicated point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of
service but at a much lower cost. By switching traffic to balance utilisation
as they saw fit, they were able to utilise their overall network bandwidth more
effectively. The cloud symbol was used to denote the demarcation point between
that which was the responsibility of the provider and that which was the
responsibility of the user. Cloud computing extends this boundary to cover
servers as well as the network infrastructure. The underlying concept of cloud
computing dates back to the 1960s, when John McCarthy opined that
"computation may someday be organised as a public utility." Almost
all the modern-day characteristics of cloud computing (elastic provision, provided
as a utility, online, illusion of infinite supply), the comparison to the
electricity industry and the use of public, private, government, and community
forms, were thoroughly explored in Douglas Parkhill's 1966 book, The Challenge
of the Computer Utility. Other scholars have shown that cloud computing's roots
go all the way back to the 1950s when scientist Herb Grosch (the author of
Grosch's law) postulated that the entire world would operate on dumb terminals
powered by about 15 large data centers. An early but surprisingly complete implementation of cloud computing was built and patented (in Germany and England) by Hardy Schloer, who termed it the "one-page web"; it featured multiple user applications, multiple identification providers, cloud storage, back-end servers with plug-in applications, a multi-tiered server architecture able to handle different user devices over the Internet, and built-in security features.
The ubiquitous availability of high-capacity networks, low-cost computers and storage devices, as well as the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing, have led to tremendous growth in cloud computing.
After
the dot-com bubble, Amazon played a key role in the development of cloud
computing by modernising their data centers, which, like most computer
networks, were using as little as 10% of their capacity at any one time, just
to leave room for occasional spikes. Having found that the new cloud
architecture resulted in significant internal efficiency improvements whereby
small, fast-moving "two-pizza teams" could add new features faster
and more easily, Amazon initiated a new product development effort to provide
cloud computing to external customers, and launched Amazon Web Services (AWS) on
a utility computing basis in 2006. In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for deploying private clouds. In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds. In the same year, efforts were focused on providing quality-of-service guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project, resulting in a real-time cloud environment.
By mid-2008, Gartner saw an opportunity for cloud computing "to shape the
relationship among consumers of IT services, those who use IT services and
those who sell them" and observed that "[o]rganisations are switching
from company-owned hardware and software assets to per-use service-based
models" so that the "projected shift to computing... will result in
dramatic growth in IT products in some areas and significant reductions in
other areas." In 2012, Dr. Biju John and Dr. Souheil Khaddaj incorporated
the semantic term into the cloud "Cloud computing is a universal
collection of data which extends over the internet in the form of resources
(such as information hardware, various platforms, services etc.) and forms
individual units within the virtualization environment. Held together by
infrastructure providers, service providers and the consumer, then it is
semantically accessed by various users." (CLUSE 2012), Bangalore, April
2012
Characteristics
Cloud
computing exhibits the following key characteristics:
Agility improves with users' ability to
re-provision technological infrastructure resources.
Application programming interface (API)
accessibility to software that enables machines to interact with cloud software
in the same way the user interface facilitates interaction between humans and
computers. Cloud computing systems typically use REST-based APIs.
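As a rough illustration, here is a minimal Python sketch of such a REST call using the widely used requests library; the base URL, token and response fields are invented for the example, and real providers differ in URL structure and authentication.

# Minimal sketch of a REST-style cloud API call. The endpoint,
# token and response fields below are hypothetical.
import requests

API_BASE = "https://api.example-cloud.com/v1"  # hypothetical endpoint
TOKEN = "my-secret-token"                      # hypothetical credential

# List the caller's virtual machine instances.
resp = requests.get(
    f"{API_BASE}/instances",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
for instance in resp.json()["instances"]:
    print(instance["id"], instance["status"])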
Cost is claimed to be reduced, and in a public cloud delivery model capital expenditure is converted to operational expenditure. This is purported to lower barriers to entry, as infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer IT skills are required for (in-house) implementation. The e-FISCAL project's state-of-the-art repository contains several articles looking into cost aspects in more detail, most of them concluding that cost savings depend on the type of activities supported and the type of infrastructure available in-house.
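To make the shift from capital to operational expenditure concrete, here is a toy comparison with entirely made-up prices; actual cloud pricing varies widely by provider, region and service.

# Toy cost comparison; all figures are assumptions for illustration.
server_purchase = 8000.0   # assumed one-time hardware cost
lifetime_months = 48       # assumed useful life of owned hardware
hours_per_month = 730
hourly_rate = 0.10         # assumed pay-per-use rate
utilisation = 0.15         # workload keeps a server busy 15% of the time

owned_monthly = server_purchase / lifetime_months
cloud_monthly = hours_per_month * utilisation * hourly_rate
print(f"Owned server (amortised): ${owned_monthly:.2f}/month")
print(f"Cloud, paying only for usage: ${cloud_monthly:.2f}/month")

For a lightly used workload like this one, paying only for the hours actually consumed is far cheaper than amortising a dedicated purchase, which is exactly the fine-grained pricing argument made above.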
Device and location independence enable
users to access systems using a web browser regardless of their location or
what device they are using (e.g., PC, mobile phone). As infrastructure is
off-site (typically provided by a third-party) and accessed via the Internet,
users can connect from anywhere.
Virtualization technology allows servers and storage devices to be shared and utilization to be increased. Applications can easily be migrated from one physical server to another.
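As a sketch of what such a migration can look like in practice, the following uses the libvirt Python bindings to live-migrate a virtual machine between two hosts; the host URIs and VM name are assumptions for the example.

# Sketch of live VM migration with the libvirt Python bindings.
# Host URIs and the VM name are hypothetical.
import libvirt

src = libvirt.open("qemu+ssh://host1.example.com/system")
dst = libvirt.open("qemu+ssh://host2.example.com/system")

dom = src.lookupByName("web-vm-01")  # assumed VM name
# VIR_MIGRATE_LIVE keeps the guest running during the transfer.
dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

src.close()
dst.close()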
Multitenancy enables sharing of
resources and costs across a large pool of users thus allowing for:
Centralization of infrastructure in
locations with lower costs (such as real estate, electricity, etc.)
Peak-load capacity increases (users
need not engineer for highest possible load-levels)
Utilisation and efficiency improvements
for systems that are often only 10–20% utilised.
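A back-of-the-envelope calculation shows why this pooling matters; the shared-pool utilisation figure is an assumption.

# If dedicated servers idle at ~15% utilisation (within the 10-20%
# range above) and a multi-tenant pool sustains an assumed 60%,
# the pool needs roughly a quarter of the hardware.
dedicated_util = 0.15
shared_util = 0.60
print("Hardware reduction factor:", shared_util / dedicated_util)  # ~4x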
Reliability is improved if multiple
redundant sites are used, which makes well-designed cloud computing suitable
for business continuity and disaster recovery.
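This effect can be quantified: if each of n independent sites is up with probability a, then at least one site is up with probability 1 - (1 - a)^n. A quick check with illustrative numbers:

# Availability of n independent redundant sites, each with
# single-site availability a (0.99 is an illustrative assumption).
a = 0.99
for n in (1, 2, 3):
    print(f"{n} site(s): {1 - (1 - a) ** n:.6f}")
# 1 site: 0.99; 2 sites: 0.9999; 3 sites: 0.999999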
Scalability and elasticity via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis in near real-time, without users having to engineer for peak loads.
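One way to picture such on-demand provisioning is a simple proportional scaling rule of the kind autoscalers apply; the target utilisation and numbers below are assumptions, not any particular provider's policy.

# Sketch of a proportional autoscaling rule: size the fleet so that
# average utilisation lands near an assumed target.
TARGET_UTIL = 0.60

def scale(current_instances: int, avg_cpu_util: float) -> int:
    """Return an instance count that brings utilisation near target."""
    desired = round(current_instances * avg_cpu_util / TARGET_UTIL)
    return max(1, desired)  # never scale down to zero

print(scale(4, 0.90))  # load spike: grow from 4 to 6 instances
print(scale(4, 0.15))  # quiet period: shrink from 4 to 1 instance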
Performance is monitored, and
consistent and loosely coupled architectures are constructed using web services
as the system interface.
Security could improve due to
centralization of data, increased security-focused resources, etc., but
concerns can persist about loss of control over certain sensitive data, and the
lack of security for stored kernels. Security is often as good as or better
than other traditional systems, in part because providers are able to devote
resources to solving security issues that many customers cannot afford.
However, the complexity of security is greatly increased when data is
distributed over a wider area or greater number of devices and in multi-tenant
systems that are being shared by unrelated users. In addition, user access to
security audit logs may be difficult or impossible. Private cloud installations
are in part motivated by users' desire to retain control over the
infrastructure and avoid losing control of information security.
Maintenance of cloud computing
applications is easier, because they do not need to be installed on each user's
computer and can be accessed from different places.
Relevance in today’s life
Many
universities, vendors and government organizations are investing in research
around the topic of cloud computing:
In October 2007, the Academic Cloud
Computing Initiative (ACCI) was announced as a multi-university project
designed to enhance students' technical knowledge to address the challenges of
cloud computing. In April 2009, UC Santa Barbara released the first open source
platform-as-a-service, AppScale, which is capable of running Google App Engine
applications at scale on a multitude of infrastructures.
In April 2009, the St Andrews Cloud
Computing Co-laboratory was launched, focusing on research in the important new
area of cloud computing. Unique in the UK, StACC aims to become an
international centre of excellence for research and teaching in cloud computing
and will provide advice and information to businesses interested in using
cloud-based services.
In October 2010, the TClouds
(Trustworthy Clouds) project was started, funded by the European Commission's
7th Framework Programme. The project's goal is to research and inspect the legal foundations and architectural design needed to build a resilient and trustworthy cloud-of-clouds infrastructure. The project also develops a prototype to demonstrate its results.
In December 2010, the TrustCloud
research project was started by HP Labs
Singapore to address transparency and accountability of cloud computing via
detective, data-centric approaches
encapsulated in a five-layer TrustCloud Framework. The team identified
the need for monitoring data life cycles and transfers in the cloud, leading to
the tackling of key cloud computing security issues such as cloud data
leakages, cloud accountability and cross-national data transfers in
transnational clouds.
In July 2011, the High Performance Computing Cloud (HPCCLoud) project was kicked off to investigate the possibilities of enhancing performance in cloud environments when running scientific applications, including development of the HPCCLoud Performance Analysis Toolkit. The project was funded by the CIM-Returning Experts Programme, under the coordination of Prof. Dr. Shajulin Benedict.
In June 2011, the Telecommunications
Industry Association developed a Cloud Computing White Paper, to analyze the
integration challenges and opportunities between cloud services and traditional
U.S. telecommunications standards.
Advantages
There
are many possible advantages of cloud computing, but they may not apply to all
consumers.
Reduced costs
Cloud
services paid for on a usage basis can be financially advantageous for a
consumer when compared to the outright purchase, or long-term rental, of what
would be a big-budget item. Also, there are reduced operating costs, because a
cloud consumer does not need to house, staff and maintain their own equipment.
Up-to-date software
SaaS consumers can always have the most up-to-date software, because versioning is controlled centrally by the cloud provider, and when the provider makes a new release it is automatically available to every user.
This is particularly advantageous for cloud desktops, because deploying new software versions can be very costly and time-consuming for a large organisation with many PCs, and it can therefore be difficult to ensure that everyone has the same version of the organisation's PC software applications at any one time.
Improved access
Cloud
computing involves using the Internet, and this can provide access from
multiple locations and many different types of user device.
Sharing and co-operation
Cloud
services are advantageous, when compared to PCs and local servers, for
activities that require co-operation among distributed groups.
Flexible and infinite scaling
Flexible and seemingly infinite scaling can be an advantageous feature of cloud-computing services, for example to absorb a sudden increase in demand from users. This has traditionally been a difficulty for fully owned and self-managed IT resources, where there may be, for example, one server of a given, fixed size, some of whose capacity is wasted when demand is low, but which may be overloaded, resulting in slow response times, when demand is high.
Simpler capacity planning
Cloud computing moves the IT capacity-planning role from the consumer to the cloud provider, and the provider can be in a better position to optimize the cloud resources used by its consumers than the consumers themselves would be for their own resources.
For
example, the provider may be able to supply better demand smoothing, because
they can perform capacity planning over a much larger pool of resources, and
for a large group of consumers, whose peak loads will probably not occur all at
the same time.
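A toy simulation makes this demand-smoothing effect visible; the demand pattern is invented purely for illustration.

# Pooled peak demand vs. the sum of individual peaks. Each consumer
# usually needs 1 unit per hour but occasionally spikes to 10; the
# 5% spike probability is an assumption.
import random

random.seed(42)
consumers, hours = 100, 24
demand = [[10 if random.random() < 0.05 else 1 for _ in range(hours)]
          for _ in range(consumers)]

sum_of_peaks = sum(max(d) for d in demand)
pooled_peak = max(sum(d[h] for d in demand) for h in range(hours))
print("Capacity if each consumer provisions alone:", sum_of_peaks)
print("Capacity the pooled provider needs:", pooled_peak)

Because the spikes rarely coincide, the provider's pooled peak is a small fraction of the capacity the consumers would have to buy individually.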
Risks
Besides
the advantages of cloud computing, there are also risks, at least for some
consumers.