Grid Vs Cluster

Cluster - a widely used term for independent computers combined into a unified system through software and networking. At the most fundamental level, when two or more computers are used together to solve a problem, they form a cluster. Clusters are typically used either for High Availability (HA), to provide greater reliability, or for High Performance Computing (HPC), to provide greater computational power than a single computer can.
Grid computing - a form of distributed computing that involves coordinating and sharing computing, application, data, storage, or network resources across dynamic and geographically dispersed organizations. Grid technologies promise to change the way organizations tackle complex computational problems. However, the vision of large-scale resource sharing is not yet a reality in many areas: Grid computing is still an evolving field, and the standards and technology needed to enable this new paradigm are still being developed.

Grid versus Cluster: at first glance you may think that a Grid is just an extension of cluster technology. This is not the case.
Clusters are made up of dedicated components, all of which are exclusively owned and managed as part of the cluster. All resources are known, fixed, and usually uniform in configuration: a cluster is a static environment.
Grids, by contrast, are configured from computer systems that are individually managed and are used both as independent systems and as part of the grid. Individual components are therefore not 'fixed' in the grid, and the overall configuration of the grid changes over time. The result is a dynamic system that must continually reassess and optimize its use of whatever resources are available, as the sketch below illustrates.
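To make the static-versus-dynamic distinction concrete, here is a minimal Python sketch. All of the names in it (Cluster, Grid, submit, join, leave) are invented for illustration, not the API of any real scheduler: a cluster's membership is decided once at configuration time, while a grid must rediscover its membership every time work is submitted.

    import random

    class Cluster:
        """Toy cluster: a fixed, dedicated, uniformly configured set of nodes."""

        def __init__(self, nodes):
            # Membership is decided once, at configuration time.
            self.nodes = tuple(nodes)  # immutable: a static environment

        def submit(self, job):
            # Every node is dedicated and known, so dispatch is trivial.
            node = random.choice(self.nodes)
            print(f"[cluster] running {job!r} on {node}")

    class Grid:
        """Toy grid: independently managed nodes that come and go."""

        def __init__(self):
            self.nodes = set()  # membership changes over time

        def join(self, node):
            # A machine volunteers its spare capacity to the grid...
            self.nodes.add(node)

        def leave(self, node):
            # ...and may withdraw at any moment to serve its own owner.
            self.nodes.discard(node)

        def submit(self, job):
            # The grid must reassess what is available right now.
            if not self.nodes:
                print(f"[grid] no resources available for {job!r}, queueing")
                return
            node = random.choice(sorted(self.nodes))
            print(f"[grid] running {job!r} on {node}")

    cluster = Cluster(["node1", "node2", "node3"])
    cluster.submit("render frame 42")

    grid = Grid()
    grid.join("lab-pc-7")
    grid.submit("protein fold")
    grid.leave("lab-pc-7")
    grid.submit("protein fold")  # the resource pool changed underneath us

Note that the cluster's node list never changes after construction, while the grid's last submission finds an empty pool: that is exactly the static/dynamic difference described above.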
Posted by Viji Sundararajan at 3:45 PM, Thursday, June 16, 2005