Cloud computing and grid computing are not the same
While cloud computing and grid computing both involve large-scale computer networks, they are certainly not the same thing.
Cloud computing refers to an architecture in which large groups of remote servers are networked to allow centralized data storage and online access to computer services or resources. Of course, there are several cloud delivery models (IaaS, PaaS, DaaS and XaaS), each of which is delivered in its own way.
Grid computing refers to a distributed computing architecture in which a set of networked computers (“the grid”, usually PCs) is used for large computational tasks, typically parallel ones. Basically, grid computing pools computer resources from multiple locations to reach a common goal. The grid can be thought of as a distributed system with non-interactive workloads that involve a large number of files.
Cloud computing vs grid computing
The difference between grid computing and cloud computing can be hard to pin down because the two are not always mutually exclusive. In fact, both are used to economize on computing by making the most of existing resources. The difference lies in how tasks are computed in each environment. In a computational grid, one large job is divided into many small portions and executed on multiple machines. This characteristic is fundamental to a grid; not so in a cloud.
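As a toy illustration of that grid-style splitting, here is a minimal Python sketch in which a worker pool stands in for independent grid nodes; the chunking scheme and the partial_sum helper are hypothetical, chosen only to show one job being divided, farmed out, and recombined:

```python
# Minimal sketch: split one large job into chunks and fan them out to
# workers, the way a computational grid fans work out to its machines.
# Python's multiprocessing pool stands in for the grid nodes here.
from multiprocessing import Pool

def partial_sum(bounds):
    """Compute one small portion of the overall job."""
    start, end = bounds
    return sum(i * i for i in range(start, end))

if __name__ == "__main__":
    n, workers = 10_000_000, 8
    step = n // workers
    chunks = [(i * step, min((i + 1) * step, n)) for i in range(workers)]

    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)  # one portion per "node"

    print(sum(partials))  # combine the partial results into the final answer
```

In a real grid the chunks would travel to geographically dispersed machines rather than local processes, but the divide-compute-combine shape is the same.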
Cloud computing is intended to let users consume various services without investing in the underlying architecture. While grid computing offers a similar facility for raw computing power, cloud computing is not restricted to that: a cloud can offer many different services, from web hosting to word processing, and it can combine services to present the user with a consistent, optimized result.
Grid computing, by contrast, is a loose network of computers that can be called into service for a large-scale processing task. The network runs over the Internet, but only computers that have opted into the grid are called upon. Although its machines are geographically distributed, grid computing allows parallel processing on a massive scale. In short, grid computing is what you want when you have one big processing job to tackle.
Basically, grid computing has two main benefits: unused processing power is put to effective use, maximizing available resources, and the time needed to complete a large job is significantly reduced.
Cloud computing, on the other hand, usually involves accessing resources on an as-needed basis from clusters of servers. These clusters can handle large processing loads, but they are intended to provide scalable processing to users on a smaller scale. Instead of handling one huge task from a single user, cloud computing handles many smaller requests from many users. This lets users scale up their computing resources for a temporary processing spike without investing in physical servers that might be needed only rarely.
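The cloud pattern can be sketched the same way: many small, independent requests drawn from a shared pool of capacity. In this hypothetical Python sketch, a thread pool stands in for an elastic server cluster and handle_request is a made-up placeholder for any hosted service:

```python
# Minimal sketch of the cloud pattern: many small, independent requests
# from many users, served by a shared pool of workers sized on demand.
from concurrent.futures import ThreadPoolExecutor

def handle_request(user_id: int) -> str:
    """Serve one small request; each request is independent of the others."""
    return f"response for user {user_id}"

requests = range(100)   # many users, each with a small task
pool_size = 10          # "scale up" by raising this during a traffic spike

with ThreadPoolExecutor(max_workers=pool_size) as pool:
    responses = list(pool.map(handle_request, requests))

print(responses[0], "...", responses[-1])
```

Note the contrast with the grid sketch above: here nothing is divided and recombined; the pool simply absorbs many unrelated requests, and capacity is dialed up or down to match demand.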
Finally, keep in mind that although grid and cloud computing differ in their fundamental concepts, that does not mean they are mutually exclusive. It is quite viable to run a cloud within a computational grid, just as it is possible to have a computational grid as part of a cloud. They can even be the same network, merely represented in two different ways.
Photo credit: https://www.flickr.com/photos/7897291@N04/8681750288/