Comparative Study between Utility Computing and Grid Computing
Utility computing:
is the packaging of computing resources, such as computation,
storage and services, as a metered service. This model has the
advantage of a low or no initial cost to acquire computer
resources; instead, computational resources are essentially rented.
This repackaging of computing services became the foundation of
the shift to "On Demand" computing, Software as a Service and
Cloud Computing models that further propagated the idea of
computing, application and network as a service.
[1] There was some initial skepticism about such a significant shift.
However, the new model of computing caught on and eventually
became mainstream.
IBM, HP and Microsoft were early leaders in the new field of Utility
Computing with their business units and researchers working on
the architecture, payment and development challenges of the new
computing model. Google, Amazon and others started to take the
lead in 2008, as they established their own utility services for
computing, storage and applications.
Utility computing can support grid computing, which is characterized
by very large computations or sudden peaks in demand that are
supported via a large number of computers.
"Utility computing" has usually envisioned some form of
virtualization so that the amount of storage or computing power
available is considerably larger than that of a single time-sharing
computer. Multiple servers are used on the "back end" to make this
possible. These might be a dedicated computer cluster specifically
built for the purpose of being rented out, or even an under-utilized
supercomputer. The technique of running a single calculation on
multiple computers is known as distributed computing.
The term "grid computing" is often used to describe a particular
form of distributed computing, where the supporting nodes are
geographically distributed or cross administrative domains. To
provide utility computing services, a company can "bundle" the
resources of members of the public for sale; those members might be
paid with a portion of the revenue from clients.
One model, common among volunteer computing applications, is
for a central server to dispense tasks to participating nodes, at the
behest of approved end-users (in the commercial case, the paying
customers). Another model, sometimes called the Virtual
Organization (VO), is more decentralized, with
organizations buying and selling computing resources as needed
or as they go idle.
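As an illustration of the first model, here is a minimal Python sketch of a central server dispensing tasks to participating nodes. The DispatchServer class, its method names, and the in-memory queue are hypothetical stand-ins for what real volunteer-computing middleware would provide.

```python
import queue

class DispatchServer:
    """Toy central dispatcher for the volunteer-computing model above."""

    def __init__(self):
        self.tasks = queue.Queue()   # work units submitted by approved end-users
        self.results = {}            # task_id -> result reported by a node

    def submit(self, task_id, payload):
        """An approved end-user (or paying customer) enqueues a work unit."""
        self.tasks.put((task_id, payload))

    def fetch(self):
        """A participating node polls for its next work unit."""
        try:
            return self.tasks.get_nowait()
        except queue.Empty:
            return None

    def report(self, task_id, result):
        """A node returns its computed result to the server."""
        self.results[task_id] = result

# Usage: one node draining the queue.
server = DispatchServer()
server.submit("t1", {"range": (0, 1000)})
task = server.fetch()
if task:
    task_id, payload = task
    server.report(task_id, sum(range(*payload["range"])))
print(server.results)
```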
The definition of "utility computing" is sometimes extended to
specialized tasks, such as web services.
HISTORY:
In the 1960s and '70s, IBM and other mainframe providers conducted
this kind of business, often referred to as time-sharing, offering
computing power and database storage to banks and other large
organizations from their worldwide data centers.
To facilitate this business model, mainframe operating systems
evolved to include process control facilities, security, and user
metering. The advent of mini computers changed this business
model, by making computers affordable to almost all companies.
As Intel and AMD increased the power of PC architecture servers
with each new generation of processor, data centers became filled
with thousands of servers.
In the late 1990s, utility computing re-surfaced. InsynQ, Inc.
launched on-demand applications and desktop hosting services
in 1997 using HP equipment. In 1998, HP set up the Utility
Computing Division in Mountain View, CA, assigning former Bell
Labs computer scientists to begin work on a computing power
plant, incorporating multiple utilities to form a software stack.
Services such as "IP billing-on-tap" were marketed. HP introduced
the Utility Data Center in 2001. Sun announced the Sun Cloud
service to consumers in 2000. In December 2005, Alexa launched
Alexa Web Search Platform, a Web search building tool for which
the underlying power is utility computing. Alexa charges users for
storage, utilization, etc. There is space in the market for specific
industries and applications as well as other niche applications
powered by utility computing. For example, PolyServe Inc. offers a
clustered file system based on commodity server and storage
hardware that creates highly available utility computing
environments for mission-critical applications including Oracle
and Microsoft SQL Server databases, as well as workload
optimized solutions specifically tuned for bulk storage, high-
performance computing, vertical industries such as financial
services, seismic processing, and content serving. The Database
Utility and File Serving Utility enable IT organizations to
independently add servers or storage as needed, retask workloads
to different hardware, and maintain the environment without
disruption.
In spring 2006 3tera announced its AppLogic service and later that
summer Amazon launched Amazon EC2 (Elastic Compute Cloud).
These services allow the operation of general purpose computing
applications. Both are based on Xen virtualization software and the
most commonly used operating system on the virtual computers
is Linux, though Windows and Solaris are supported. Common
uses include web applications, SaaS, image rendering and
processing, and also general-purpose business applications.
Utility computing merely means "Pay and Use" with regard to
computing power.
Utility Computing Today:
How does utility computing play out in today's storage and
networking marketplace? Depending on who you talk to, utility
computing might be an IT management approach, a business
strategy or a software/hardware tool. HP's Mark Linesch, VP of
adaptive enterprise programs, put it as well as anybody: "It's not
about a big new technology... It's about establishing a tighter, more
dynamic link between the business and its IT infrastructure."
That is because utility computing lives or dies on the integration of
its parts. Utility networks exist today, but true utility computing
requires close coordination between hardware components, the
applications that run on them, and the data management tools
that handle provisioning, storage pooling, and a myriad of tasks
that require wide-scale automation across a utility network. The
utility infrastructure must be able to automatically provision and
deliver resources on demand, while tracking usage for later
chargeback.
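To make the chargeback idea concrete, here is a small sketch of per-customer usage metering. The resource names and rates are illustrative assumptions, not real pricing.

```python
from collections import defaultdict

# Hypothetical per-unit rates for metered resources.
RATES = {"cpu_hours": 0.05, "gb_storage_hours": 0.001}

class UsageMeter:
    """Track resource consumption per customer for later chargeback."""

    def __init__(self):
        self.usage = defaultdict(lambda: defaultdict(float))

    def record(self, customer, resource, amount):
        self.usage[customer][resource] += amount

    def chargeback(self, customer):
        """Compute the bill from metered usage: pay only for what was used."""
        return sum(RATES[r] * amt for r, amt in self.usage[customer].items())

meter = UsageMeter()
meter.record("dept-finance", "cpu_hours", 120)
meter.record("dept-finance", "gb_storage_hours", 5000)
print(f"${meter.chargeback('dept-finance'):.2f}")  # -> $11.00
```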
Such a level of flexibility and tracking requires management tools
that are currently in their infancy, which explains why not every
company is jumping on the utility bandwagon (basing your
company's IT life on a bunch of relatively untried tools is only for
the very brave or the foolhardy). But the real holdup for utility
computing is that application providers have yet to move en
masse toward UC-ready licensing models. "The software licensing
models in particular are currently the barrier to utility pricing
models," says Corey Ferengul, senior vice president at Meta Group.
Ideally, utility computing pricing models would allow customers to
pay "by the sip," much as we do with electricity and water. But
software vendors are still predominantly selling their products on a
per-seat or per-CPU basis, regardless of how much or how little an
individual seat or CPU is utilized.
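A rough back-of-the-envelope comparison of the two pricing models; every figure below is a made-up assumption, chosen only to show why a lightly used seat or CPU favors metered pricing.

```python
# Illustrative comparison of flat per-CPU licensing vs. metered pricing.
per_cpu_license = 2000.0      # flat annual fee per CPU, used or not (assumed)
metered_rate = 0.25           # per CPU-hour actually consumed (assumed)
cpus = 10
hours_per_year = 8760
utilization = 0.08            # a typically lightly-used server (assumed)

flat_cost = per_cpu_license * cpus
metered_cost = metered_rate * cpus * hours_per_year * utilization
print(f"per-CPU: ${flat_cost:,.0f}  metered: ${metered_cost:,.0f}")
# -> per-CPU: $20,000  metered: $1,752
# The two models break even near
# utilization = per_cpu_license / (metered_rate * hours_per_year), about 91%.
```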
Like ILM (information lifecycle management), utility computing is more a strategic approach than a
specific application or suite of applications. The idea behind utility
computing is to provide unlimited computing power and storage
capacity that can be used and reallocated for any application
and billed on a pay-per-use basis.
Ideally, utility computing brings some important benefits with it.
These include:
Simplified administration. Reduces time-consuming and complex
administration overhead. This will happen faster when going to a
reliable service provider (SP) model, but internal deployment will yield the same
benefits. Utility computing also needs scalable, standardized, and
heterogeneous computing resources, and should not depend on
highly proprietary hardware or software to work.
Capacity to meet business needs. Enables administrators to
manage fast growth and peaks-and-valleys capacity and
processing demands. Avoids network downtime and lag by
immediately provisioning for changing needs.
Cost-effectiveness. Leverages infrastructure costs to meet changing
business requirements and serves business growth. Automated
provisioning based on need yields excellent ROI on internal
resources.
Basic requirements for successful utility computing
Automating costing procedures for computing resources. Billing or
chargeback information should be driven by the capacity required
to support business processes. As a result of properly aligning
infrastructure with business processes, the business wants IT to
help minimize the costs of providing business services. Note that
this sounds good on paper but can lead to heavy political
infighting: Many business units hate chargeback because it adds
costs to their bottom line. But in the face of spiraling IT costs - all
of which are coming out of their budget - CIOs are increasingly
unsympathetic.
Automated provisioning to meet the business units' scaled-up or
scaled-down needs. Without automated provisioning, IT
departments have to resort to painful manual techniques to deal
with impossibly complex server farms, a plethora of operating
systems, multiplying storage systems and expensive management
software. The better automated provisioning technology gets, the
easier this critical piece of utility computing will become.
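As a sketch of what such automated provisioning might look like, here is a toy policy loop that scales a server pool against a target utilization band; the thresholds and the provision function are assumptions for illustration.

```python
def provision(pool_size, current_load, low=0.30, high=0.75):
    """Return the new pool size for the observed load (in server-equivalents)."""
    utilization = current_load / pool_size
    if utilization > high:
        return pool_size + 1          # scale up before demand outruns capacity
    if utilization < low and pool_size > 1:
        return pool_size - 1          # return idle capacity to the shared pool
    return pool_size

size = 4
for load in [3.5, 3.8, 1.0, 0.9]:
    size = provision(size, load)
    print(size)   # -> 5, 6, 5, 4: grows under pressure, shrinks when idle
```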
Virtualization. Virtualization is an underlying technology that
makes it possible to quickly ready storage for incoming
applications. Virtualization actually ranges from a visual screen
where administrators can make changes to their storage
assignments, up to automatic provisioning where the software
does it for you.
Other types of automation:
Discovery. Automatically identify storage networking devices (hosts,
storage, etc.) and be able to apply them to specific business
processes.
Provisioning. This is the big one. Automation should work to allocate
computing power and storage room to shifting workloads. It should
also know how to apply various settings, like user authentication and
security policies, to various types of data and originating
applications.
Configuration. Automatically implements network settings across
environments, like system configurations, security settings and
storage definitions.
Self-healing. Automate problem detection and subsequent correction
or recovery.
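A minimal sketch of the self-healing idea from the list above: probe for problems, then run the matching remedy. The probes and remedies here are hypothetical placeholders.

```python
# Toy self-healing loop: each probe detects a condition, each remedy fixes it.
def check_disk(): return True          # placeholder health check
def check_service(): return False      # placeholder: simulates a failed service
def restart_service(): print("restarting service...")

RUNBOOK = [
    (check_disk, lambda: print("freeing disk space...")),
    (check_service, restart_service),
]

def heal_once():
    for probe, remedy in RUNBOOK:
        if not probe():          # problem detection
            remedy()             # automated correction or recovery

heal_once()   # -> "restarting service..."
```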
Flexible systems. Virtualization and automatic provisioning will
have to work across operating systems and switches, and in
multi-vendor environments. And yes, this is a tall order.
Security. If you thought security was tough in a regular network
environment, try a utility computing network that is serving
hundreds or thousands of customers. A case in point is the recent
denial-of-service attack that Sun suffered on the very first day that
the company allowed users to buy Internet access to its much-
hyped, and much delayed, public utility grid.
Grid computing and SOA. Grid computing is a form of distributed
computing where resources are often spread across different
physical locations and domains. Grid computing is a foundation
technology for models like utility computing, where computing
resources are pay-per-use commodities. SOA (Service-Oriented
Architecture) is a computing architecture that undergirds the act of
delivering IT as a service. SOA can be used for designing, building,
and managing distributed computing environments, works best
with standards-based computing resources, and efficiently
enables utility computing infrastructure development.
Utility Computing and SMB:
Amazingly enough, utility computing might not be purely a matter
for the enterprise. IT can be as complex for an SMB (small and
midsize business) to manage as for the enterprise. SMBs commonly
lack the internal IT skills to optimize their network infrastructure,
and can benefit from a solidly hosted,
reliable and high-performance model. (Internally deploying a utility
computing infrastructure runs into exactly the same challenges
driving SMB to utility computing in the first place. At this point,
most SMBs adopting utility computing will outsource to an SP.)
There are differences between SMB and the enterprise utility
computing models, particularly the lack of a chargeback model in
SMB. According to strategic consultancy THINK Strategies, SMBs'
utility computing SPs depend primarily on network and
performance management tools, software distribution tools, and
software diagnostic tools to serve their SMB clients.
Network Management tools to proactively monitor hardware
states
Performance Management tools to effectively measure network,
system, and software performance
Software distribution tools to automatically update operating
systems and applications from a central console
Software diagnostic tools to perform system and software
analyses, and self-healing techniques
Predictions
I expect utility computing to dovetail with developments in grid
computing, SOA, automated provisioning and discovery, security,
and other foundational technologies. Over time, storage,
databases and applications will increasingly be made available for
customers to access on demand over networks that appear as one
large virtual computing system. Utility computing provides the
enterprise with a charge-back function to support this business
model. SMBs will increasingly turn to their own brand of utility
computing, turning over network management to an SP.
Utility computing is ultimately about how companies can make
better use of all their computing resources. By delivering fast and
intelligent access to network resources, utility computing
leverages computing infrastructure costs and reduces
management overhead.
A Quick Definition
Utility computing is a model in which each IT resource is treated
as a unit of capacity that is delivered when and where it is needed.
Resources are shared across applications. Servers and their
associated resources are aggregated into pools and allocated to
applications as needed. When an application needs more
resources - perhaps because of a component failure or a spike in
demand - allocation occurs immediately. When demand subsides,
resources are returned to the shared pools.
Technologies that Come into Play
Server virtualization and systems-based automation are two key
technologies for enabling utility computing. Server virtualization is
the running of multiple server images - called virtual servers (or
machines) - on a single physical host server. To the environment,
virtual servers look just like separate physical servers. Physical
servers can run different operating systems and applications
concurrently in isolated virtual machines and can host virtualized
high-speed network connections between virtual servers. This
technology reduces the number of physical servers and
associated hardware components you need, because virtual
servers share network and storage devices.
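For a concrete view of a virtualized host, the sketch below lists the virtual servers running on one physical machine via the libvirt Python bindings; it assumes libvirt-python is installed and a local QEMU/KVM (or any libvirt-supported hypervisor, including Xen) is reachable.

```python
import libvirt  # assumes the libvirt-python package is installed

# Read-only connection to the local hypervisor (URI is an assumption;
# a Xen host would use a xen:/// URI instead).
conn = libvirt.openReadOnly("qemu:///system")
for dom in conn.listAllDomains():
    state = "running" if dom.isActive() else "stopped"
    # maxMemory() reports KiB; each domain is an isolated virtual server
    print(f"{dom.name():20s} {state:8s} {dom.maxMemory() // 1024} MiB")
conn.close()
```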
Data center optimization solutions help you plan, build, operate
and manage your utility computing infrastructure. The solutions
simplify management, increase efficiency and reduce costs by
providing:
Automatic discovery,
Modeling of IT infrastructure,
Availability and performance monitoring and management,
Real-time capacity management,
Automatic provisioning of resources, and
A single source of resource reference across all IT disciplines.
Implementing Utility Computing
Implementing a utility computing environment involves four steps:
Discovery,
Analysis and planning (modeling),
Implementation, and
Optimization.
The first step is to discover all assets in your IT infrastructure and
populate a configuration management database (CMDB) with
information about those assets. Information includes IT resources,
their configurations and their users. This information helps you
determine the relationships among the resources and the services
they support. Systems-based solutions come into play here by
populating the CMDB and maintaining its accuracy. Because the
infrastructure changes constantly as you add, update and replace
components, automated discovery must run on a regular basis to
keep the CMDB up to date.
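A toy illustration of such a scheduled discovery pass populating a CMDB; real products probe SNMP, WMI, or agents, while this sketch only resolves hostnames and records when each asset was last seen.

```python
import socket
import datetime

cmdb = {}   # asset name -> configuration record

def discover(hosts):
    """One discovery pass: refresh the record for every reachable host."""
    for host in hosts:
        try:
            addr = socket.gethostbyname(host)
        except socket.gaierror:
            continue                      # unreachable: leave the record stale
        cmdb[host] = {
            "ip": addr,
            "last_seen": datetime.datetime.now().isoformat(),
        }

# Run on a schedule so the CMDB tracks a constantly changing infrastructure.
discover(["localhost"])
print(cmdb)
```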
In step 2, analysis and planning, you model the target environment
to create a workload or application perspective of resource
utilization. Modeling takes the guesswork out of physical server
sizing and helps you achieve maximum resource utilization. You
interact with the model and vary parameters until you understand:
How many physical systems you can move onto a single physical
host server as virtual machines,
How big the physical host server must be,
The right mix of virtual machines on each physical host server,
How future growth will affect capacity requirements, and
The current resource demand cycles relative to the business
services.
The output of this step is a comprehensive hardware resource plan
- knowledge of the specific hardware configurations needed to
sustain the virtual environment.
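One way to answer the sizing questions above is a simple first-fit-decreasing packing of workload peaks onto hosts, as sketched below; the demand numbers and host capacity are illustrative assumptions.

```python
def plan(workloads, host_capacity):
    """workloads: peak demand per existing server (normalized CPU units).
    Returns how many physical host servers the virtual environment needs."""
    hosts = []                                   # remaining capacity per host
    for demand in sorted(workloads, reverse=True):
        for i, free in enumerate(hosts):
            if free >= demand:                   # fits on an existing host
                hosts[i] -= demand
                break
        else:
            hosts.append(host_capacity - demand) # open a new physical host
    return len(hosts)

peaks = [0.6, 0.3, 0.5, 0.2, 0.4, 0.1]           # six physical servers today
print(plan(peaks, host_capacity=1.0))            # -> 3 virtualization hosts
```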
Step 3 involves moving physical server workloads to virtual
servers. You'll be making a number of potentially complex changes
to the overall IT environment. To minimize risk, you need to
encapsulate all changes within a broader change and
configuration management process that ensures that only planned
changes are authorized, only authorized changes are initiated, and
that changes are implemented as planned and authorized. It's a
good idea to make the move incrementally, starting with
workloads that have the greatest potential for improvement. It's
also a good idea to work in a test environment first.
Automation during this phase helps bring accuracy, repeatability
and scalability to the entire project. To ensure accuracy and
compliance of the resulting total configuration, you need to build a
Definitive Software Library (DSL) that defines specific software
configurations, such as versions and patch levels required for all
the hardware configurations specified in the resource plan.
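A minimal sketch of a DSL compliance check: compare each host's reported software versions against the approved baseline and flag drift. The package names and versions are made up.

```python
# Approved versions from the (hypothetical) Definitive Software Library.
DSL = {"kernel": "5.15", "db": "19.3", "agent": "2.1"}

def compliance_drift(host_report):
    """Return the packages whose installed version drifts from the DSL."""
    return {pkg: (ver, DSL[pkg])
            for pkg, ver in host_report.items()
            if pkg in DSL and ver != DSL[pkg]}

print(compliance_drift({"kernel": "5.15", "db": "19.1", "agent": "2.1"}))
# -> {'db': ('19.1', '19.3')}: an unauthorized change to investigate
```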
In the fourth step - optimization - you implement the utility
computing environment, which involves automating the sharing
and provisioning of physical and virtual resources and
applications. This orchestration is based on resource allocation
policies that you have established. To ensure service levels, it's
important to provide sufficient time to act when implementing
real-time resource allocation. Traditional change management
relies on manual steps and change approval boards, so it is critical
to establish the automation policies and subject them to
comprehensive, closed-loop change and configuration
management processes prior to implementing real-time resource
allocation strategies. This approach ensures that only planned and
authorized allocation decisions are made in real time.
Virtualization technologies help you put an end to overprovisioning
while still delivering high availability and fast performance.
Moreover, they allow you to take full advantage of utility
computing - so you can optimally match resource capacity with
business requirements through real-time capacity management.
Virtualization and utility computing position you to meet the
demands of business users for a continual stream of new and
more advanced business services. Bottom line: You can adapt to
changing business requirements while continuing to deliver high-
quality business services at the lowest possible cost. BMC
Software offers solutions that can proactively and effectively
manage virtual server environments while minimizing the risks in
deploying virtual servers.
Utility Computing Advantages and Disadvantages:
For most clients, the biggest advantage of utility computing is
convenience. The client doesn't have to buy all the hardware,
software and licenses needed to do business. Instead, the client
relies on another party to provide these services. The burden of
maintaining and administering the system falls to the utility
computing company, allowing the client to concentrate on other
tasks.
Closely related to convenience is compatibility. In a large company
with many departments, problems can arise with computing
software. Each department might depend on different software
suites. The files used by employees in one part of a company
might be incompatible with the software used by employees in
another part. Utility computing gives companies the option to
subscribe to a single service and use the same suite of software
throughout the entire client organization.
Cost can be either an advantage or disadvantage, depending on
how the provider structures fees. Using a utility computing
company for services can be less expensive than running
computer operations in-house. As long as the utility computing
company offers the client the services it needs to do business,
there's no need for the client to look elsewhere. Most of the cost
for maintenance becomes the responsibility of the provider, not
the client. The client can choose to rely on simplified hardware,
which is less expensive and can be easier to maintain.
However, in some cases what the client needs and what the
provider offers aren't in alignment. If the client is a small business
and the provider offers access to expensive supercomputers at a
hefty fee, there's a good chance the client will choose to handle its
own computing needs. Why pay a high service charge for
something you don't need?
Another potential disadvantage is reliability. If a utility computing
company is in financial trouble or has frequent equipment
problems, clients could get cut off from the services for which
they're paying. This spells trouble for both the provider and the
client. If a utility computing company goes out of business, its
clients could fall victim to the same fate. Clients might hesitate to
hand over duties to a smaller company if it could mean losing data
and other capabilities should the business suffer.
Utility computing systems can also be attractive targets for
hackers. A hacker might want to access services without paying
for them or snoop around and investigate client files. Much of the
responsibility of keeping the system safe falls to the provider, but
some of it also relies on the client's practices. If a company
doesn't educate its workforce on proper access procedures, it's
not hard for an intruder to find ways to invade a utility computing
company's system.
One challenge facing utility computing services is educating
consumers about the service. Awareness of utility computing isn't
very widespread. It's hard to sell a service to a client if the client
has never heard of it. Now that you've read this article, you're
ahead of the game.
As utility computing companies offer more comprehensive and
sophisticated services, we may see more corporations choosing to
use their services. Eventually, it's possible that computers in data
centers miles from your home or office will handle all your
computational needs for you.
Utility software, by contrast, is a kind of computer software
designed to help manage and tune computer hardware, the
operating system and application software. It performs a single
task or a number of small tasks. Examples of utility software are as
follows:
- Disk defragmenters
- System Profilers
- Virus scanners
- Application launchers
- Network managers
- Encryption utilities.
Grid Computing:
is a form of distributed computing that involves the coordinated
and controlled sharing of diverse computing, applications, data,
storage, or network resources across dynamic and
geographically dispersed multi-institutional virtual organizations.
A user of Grid computing does not need to have the data and
the software on the same computer, and neither must be on the
user's home (login) computer.
Background of Grid Computing:
The idea of Grid computing resulted from the confluence of
three developments:
The proliferation of largely unused computing resources
(especially desktop computers)
Their greatly increased CPU speed in recent years
The widespread availability of fast, universal network
connections (the Internet).
Need for Grid Computing:
1. The proliferation of largely unused computing resources
(especially desktop computers, of which 152 million were sold in
2003).
2. Their greatly increased CPU speed in recent years (now >3
GHz).
3. The widespread availability of fast, universal network
connections (the Internet).
4. High-performance computers (formerly called supercomputers)
are very expensive to buy and maintain.
5. Much of the enhancement of computing power recently
has come through the application of multiple CPUs to a
problem (e.g., NCSC had a 720-processor IBM parallel
computer).
6. Many computing tasks relegated to these (especially
massively parallel) computers could be performed by a
divide-and-conquer strategy using many more, although slower,
processors, as are available on a Grid.
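The divide-and-conquer strategy in point 6 can be sketched in a few lines of Python, with a local process pool standing in for the grid's many slower processors.

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """One node's share of the calculation: a partial sum of squares."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 1_000_000, 8
    step = n // workers
    # Split the single large calculation into independent chunks...
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    # ...farm them out, then combine the results.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)   # same answer as one big loop, computed in parallel
```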
Evolution of Grid:
1. Custom solutions (early '90s)
Exploratory metacomputing work
Applications built directly on Internet protocols (TCP/IP)
Limited functionality, security, scalability, and robustness.
2. Open Grid Services Architecture (OGSA) (2002)
Community standard with multiple implementations
Globus GT3 implementation
Service-oriented architecture based on XML Web services.
The Globus Toolkit 4:
The Globus Toolkit of the Globus Alliance is a middleware for grid
systems. Although it is still in development, almost all major grid
projects are based on this toolkit, including Instant-Grid. It can
therefore be briefly described as a reference implementation of the
OGSA specification. The Globus project arose from a collaboration
between the University of Chicago and the University of Southern
California, with the participation of IBM and NASA. It builds, among
other things, on the experience of other projects related to grid
technology, such as Condor, Codine/Sun Grid Engine, Legion,
Nimrod and Unicore. GT4 offers all the components necessary for
implementing grid systems, covering security as well as data,
resource, and administrative tasks. In addition, it provides interfaces
and libraries for popular programming environments.
Advantages:
1. No need to buy large six-figure SMP servers for applications
that can be split up and farmed out to smaller commodity-type
servers. Results can then be concatenated and analyzed upon
job completion.
2. Much more efficient use of idle resources. Jobs can be
farmed out to idle servers or even idle desktops. Many of
these resources sit idle especially during off business hours.
Policies can be in place that allow jobs to go only to servers
that are lightly loaded or have the appropriate memory/CPU
characteristics for the particular application.
3. Grid environments are much more modular and don't have
single points of failure. If one of the servers or desktops within
the grid fails, there are plenty of other resources able to pick up
the load. Jobs can automatically restart if a failure occurs.
4. Policies can be managed by the grid software. The software
is really the brains behind the grid. A client resides on each
server and sends information back to the master, telling it what
availability and resources it has to complete incoming jobs.
5. This model scales very well. Need more compute resources?
Just plug them in by installing the grid client on additional
desktops or servers. They can be removed just as easily on
the fly.
6. Upgrading can be done on the fly without scheduling
downtime. Since there are so many resources some can be
taken offline while leaving enough for work to continue. This
way, upgrades can be cascaded so as not to affect ongoing
projects.
7. Jobs can be executed in parallel, speeding performance. Grid
environments are extremely well suited to run jobs that can
be split into smaller chunks and run concurrently on many
nodes. Using things like MPI will allow message passing to
occur among compute resources.
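As a taste of the MPI pattern mentioned in point 7, here is a minimal mpi4py sketch in which each rank computes a chunk of the job and message passing combines the partial results; it assumes mpi4py and an MPI runtime are installed (run with e.g. `mpiexec -n 4 python grid_job.py`).

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each node computes its own chunk of the job concurrently...
local = sum(range(rank * 1000, (rank + 1) * 1000))

# ...then message passing combines the partial results on rank 0.
total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(f"combined result from {size} nodes: {total}")
```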
Disadvantages:
1. For memory-hungry applications that can't take advantage of
MPI, you may be forced to run on a large SMP.
2. You may need a fast interconnect between compute
resources (Gigabit Ethernet at a minimum, InfiniBand for
MPI-intensive applications).
3. Some applications may need to be tweaked to take full
advantage of the new model.
4. Licensing across many servers may make it prohibitive for
some apps. Vendors are starting to be more flexible with
environments like this.
5. Grid environments include many smaller servers across
various administrative domains. Managing change and keeping
configurations in sync with each other can be challenging in
large environments. Tools that help manage such challenges
include systemimager, cfengine, Opsware, Bladelogic, pdsh,
and cssh, among others.
6. Political challenges associated with sharing resources
(especially across different admin domains). Many groups
are reluctant to share resources even if it benefits
everyone involved. The benefits for all groups need to be
clearly articulated and policies developed that keep everyone
happy (easier said than done...).
Areas that are already taking good advantage of grid computing
include bioinformatics, cheminformatics, oil and gas drilling, and
financial applications.
With the advantages listed above, you'll start to see much larger
adoption of grids, which should benefit everyone involved. I believe
the biggest barrier right now is education.