What Does Grid Computing Mean?
Grid computing is a distributed computing architecture that combines computer resources from multiple domains to reach a common goal. In grid computing, the computers on the network work on a task together, functioning collectively as a virtual supercomputer.
Typically, a grid works on a variety of tasks within a network, but it can also be dedicated to specialized applications. It is designed to solve problems that are too big for any single supercomputer while retaining the flexibility to process many smaller problems. Computing grids therefore deliver a multiuser infrastructure that accommodates the discontinuous demands of large-scale information processing.
Techopedia Explains Grid Computing
A grid is made up of nodes connected in parallel to form a computer cluster, which typically runs an open-source operating system such as Linux. The cluster can vary in size from a small workstation to several networks. The technology is applied to a wide range of applications, such as mathematical, scientific or educational tasks, by pooling several computing resources. It is often used in structural analysis, Web services such as ATM banking, back-office infrastructures, and scientific or marketing research.
The idea of grid computing was first established in the early 1990s by Carl Kesselman, Ian Foster and Steve Tuecke. They developed the Globus Toolkit standard, which included grid services for data storage and management, data processing and intensive computation management.
Grid computing is made up of applications that solve computational problems in a parallel networking environment. It connects each participating computer and combines their partial results so that, together, they behave as one computation-intensive application.
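The scatter/gather pattern behind this idea can be shown with a minimal local sketch. Here Python's multiprocessing pool stands in for the grid's worker nodes; in a real grid, the work units would be dispatched to machines in other administrative domains, but the split-compute-combine flow is the same.

```python
# Minimal scatter/gather sketch: multiprocessing workers stand in for grid nodes.
from multiprocessing import Pool

def work_unit(bounds):
    """One independent piece of the overall problem: a partial sum of squares."""
    start, end = bounds
    return sum(i * i for i in range(start, end))

def run_grid_job(total=10_000_000, nodes=4):
    # Split the big problem into equal chunks, one per simulated node.
    step = total // nodes
    chunks = [(i * step, (i + 1) * step) for i in range(nodes)]
    with Pool(nodes) as pool:
        partial_results = pool.map(work_unit, chunks)  # scatter to workers
    return sum(partial_results)                        # gather and combine

if __name__ == "__main__":
    print(run_grid_job())
```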
Grids draw on a variety of resources built from diverse hardware and software, programming languages and frameworks, either within a single network or across organizations that follow open standards and agreed guidelines to achieve a common goal.
Grid operations are generally classified into two categories:
- Data Grid: A system that handles large distributed data sets, used for data management and controlled sharing among users. It creates virtual environments that support dispersed, organized research. The Southern California Earthquake Center is an example of a data grid; it uses a middleware system that creates a digital library, a distributed file system and a persistent archive.
- CPU Scavenging Grid: A cycle-scavenging system that moves work units from one PC to another as needed, harvesting otherwise idle processor time. A familiar CPU scavenging grid is the SETI@home search for extraterrestrial intelligence, which includes more than three million volunteer computers; a toy sketch of the idea follows this list.
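The sketch below illustrates the cycle-scavenging idea only: work units are handed to machines while they are idle and re-queued if the owner reclaims the PC mid-computation. The node names and the random idle/busy behavior are simulated stand-ins, not how any real volunteer-computing project is implemented.

```python
# Toy cycle-scavenging scheduler: units run on idle PCs and are re-queued
# whenever a PC is reclaimed by its owner before the unit finishes.
import random
from collections import deque

def run_scavenging_grid(num_units=20, nodes=("pc-a", "pc-b", "pc-c")):
    pending = deque(range(num_units))   # work units still to be processed
    completed = {}                      # unit -> node that finished it

    while pending:
        unit = pending.popleft()
        node = random.choice(nodes)
        # Simulate the owner reclaiming the PC ~25% of the time.
        if random.random() < 0.25:
            pending.append(unit)        # unit goes back into the queue
            continue
        completed[unit] = node          # unit finished on this idle node

    return completed

if __name__ == "__main__":
    results = run_scavenging_grid()
    print(f"{len(results)} work units completed on {len(set(results.values()))} nodes")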
Grid computing is standardized by the Global Grid Forum (now the Open Grid Forum) and implemented by the Globus Alliance through the Globus Toolkit, the de facto standard for grid middleware, which includes a variety of application components.
Grid architecture applies protocols defined by the Global Grid Forum, which include the following:
- Grid Security Infrastructure (GSI)
- Monitoring and Discovery Service (MDS)
- Grid Resource Allocation and Management (GRAM) protocol
- Global Access to Secondary Storage (GASS) and GridFTP
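As a rough illustration of how two of these pieces are used in practice, the sketch below drives Globus Toolkit command-line tools from Python: grid-proxy-init creates a GSI proxy credential, and globus-url-copy performs a GridFTP transfer. It assumes a toolkit installation with a configured user certificate; the host name and file paths are placeholders, and exact command options can vary between toolkit versions.

```python
# Rough sketch of a session against Globus Toolkit middleware; assumes the
# toolkit CLI tools are installed and a user certificate is configured.
# Host names and paths are placeholders.
import subprocess

def fetch_dataset():
    # GSI: create a short-lived proxy credential from the user's certificate.
    subprocess.run(["grid-proxy-init"], check=True)

    # GridFTP: copy a remote file to local disk over the gsiftp protocol.
    subprocess.run(
        [
            "globus-url-copy",
            "gsiftp://grid.example.org/data/input.dat",  # placeholder source
            "file:///tmp/input.dat",                     # placeholder destination
        ],
        check=True,
    )

if __name__ == "__main__":
    fetch_dataset()
```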