Distributed Computing
What is Distributed Computing?
Distributed computing is a model in which the components of a software system are spread across multiple computers. Although the components run on several machines, they operate as a single system, a design typically chosen to improve efficiency, fault tolerance, and overall performance.
The computers are linked over a network, either a Local Area Network (LAN) or a Wide Area Network (WAN), and communicate with each other so that different parts of a distributed program can run on separate machines in different geographic locations. A distributed program is a computer program that runs within a distributed system.
Distributed systems are defined by their structure: a typical distributed system contains many interconnected devices, each running its own software while being influenced by the messages it receives, the changes it observes in shared memory, or the statuses of other devices. Distributed systems range from simple arrangements in which a single client communicates with a single server to massive, amorphous networks such as the internet.
Distributed Computing Architecture
The architecture of distributed computing is defined by several hardware and software levels. At the lowest level, numerous central processing units (CPUs) must be connected by some form of network, whether that network is printed onto a circuit board or made up of loosely coupled devices and links. At a higher level, a communication mechanism is required to connect the processes running on those CPUs. The basic architectures of distributed computing include client-server, three-tier, n-tier, and peer-to-peer.
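As a minimal sketch of the communication mechanism mentioned above, the example below passes messages between two local processes through a queue using Python's standard multiprocessing module. The process roles and message contents are assumptions made for illustration; in a real distributed system the processes would run on separate machines and exchange messages over the network rather than through an in-memory queue.

```python
# Minimal sketch: two local processes exchanging messages through a queue.
# In a real distributed system the processes would live on different machines
# and the queue would be replaced by a network transport (sockets, RPC, a broker).
from multiprocessing import Process, Queue


def worker(inbox: Queue, outbox: Queue) -> None:
    """Receive a task message, process it, and send back a result."""
    task = inbox.get()                 # blocks until a message arrives
    outbox.put({"task": task, "result": task.upper()})


if __name__ == "__main__":
    to_worker, from_worker = Queue(), Queue()
    p = Process(target=worker, args=(to_worker, from_worker))
    p.start()

    to_worker.put("hello from the coordinating process")
    print(from_worker.get())           # {'task': ..., 'result': 'HELLO ...'}
    p.join()
```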
Client-server
Client-server is an architecture in which one or more client devices query a server for data and the server responds to each client. When client-server applications first became popular, clients were often far cheaper and less powerful than the server, so sharing the server reduced the total cost of the system. Some people no longer consider client-server an example of distributed computing, because the system components mostly reside on the server rather than being distributed more widely.
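To make the request/response pattern concrete, here is a minimal sketch of a client and server using Python's standard socket module. The host, port, and echo-style response are illustrative assumptions, not part of any particular product; a real deployment would run the two roles on different machines.

```python
# Minimal client-server sketch over TCP sockets: the server answers each
# client request. Host and port are arbitrary illustrative choices.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # assumed address for the example
ready = threading.Event()         # lets the client wait until the server listens


def server() -> None:
    """Accept one client connection and respond to its query."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen()
        ready.set()
        conn, _addr = srv.accept()
        with conn:
            request = conn.recv(1024).decode()
            conn.sendall(f"server processed: {request}".encode())


def client() -> None:
    """Send a query to the server and print the response."""
    ready.wait()
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"lookup user 42")
        print(cli.recv(1024).decode())


if __name__ == "__main__":
    t = threading.Thread(target=server)
    t.start()
    client()
    t.join()
```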
Three-tier
Three-tier architectures add a tier to the client-server model, often called the agent tier, that sits between the client and the server. The agent tier manages the client's data and state, so the client no longer has to maintain its own information. Many web applications are three-tier.
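The split can be sketched with three plain Python functions standing in for the tiers. The names and the in-memory "database" are assumptions for illustration only; in practice each tier would run on its own machine and the function calls would be network requests.

```python
# Three-tier sketch: presentation (client), agent/application tier, data tier.
# In production each tier would be a separate networked service; here plain
# function calls stand in for those network hops.

# Data tier: owns the stored records (an in-memory dict as a stand-in database).
_ACCOUNTS = {"alice": 120.0, "bob": 75.5}


def data_tier_get_balance(user: str) -> float:
    return _ACCOUNTS[user]


# Agent (middle) tier: holds the logic and state the client no longer manages.
def agent_tier_account_summary(user: str) -> str:
    balance = data_tier_get_balance(user)
    status = "in credit" if balance >= 0 else "overdrawn"
    return f"{user}: {balance:.2f} ({status})"


# Presentation tier: the thin client only displays what the agent tier returns.
if __name__ == "__main__":
    print(agent_tier_account_summary("alice"))
```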
n-tier
n-tier, also called multi-tier, is an architecture often used by web applications that route queries to multiple web services. An n-tier system typically separates its processing, data management, and presentation functions, with each component hosted on a different machine or cluster of machines. Separating the components makes the overall system easier to maintain, manage, and scale.
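A hypothetical sketch of the routing idea: a front-end gateway forwards each query to one of several independent services, each of which could be deployed on its own machine or cluster. The service names and dispatch table below are assumptions made for illustration.

```python
# n-tier sketch: a gateway routes requests to separate services.
# Each service would normally run on its own machine or cluster; here they are
# plain functions registered in a dispatch table.
from typing import Callable, Dict


def catalog_service(query: str) -> str:
    return f"catalog results for '{query}'"


def pricing_service(query: str) -> str:
    return f"price quote for '{query}'"


def reviews_service(query: str) -> str:
    return f"reviews mentioning '{query}'"


ROUTES: Dict[str, Callable[[str], str]] = {
    "/catalog": catalog_service,
    "/pricing": pricing_service,
    "/reviews": reviews_service,
}


def gateway(path: str, query: str) -> str:
    """Route an incoming request to the service that owns the path."""
    service = ROUTES.get(path)
    if service is None:
        return "404: no such service"
    return service(query)


if __name__ == "__main__":
    print(gateway("/catalog", "laptop"))
    print(gateway("/reviews", "laptop"))
```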
Peer-to-peer
Peer-to-peer architectures have no dedicated computers that deliver services or regulate network resources. Instead, all duties are distributed across all machines, referred to as peers, and each peer can act as both a client and a server. Examples of peer-to-peer architecture are BitTorrent and the Bitcoin network.
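Because every peer both serves and requests, a peer can be sketched as an object that listens on its own port while also connecting out to other peers. The ports, names, and message format below are assumptions for illustration; real peer-to-peer protocols add discovery, routing, and many concurrent connections.

```python
# Peer-to-peer sketch: each peer runs a server thread (to answer other peers)
# and can also act as a client (to query other peers). Ports are arbitrary.
import socket
import threading


class Peer:
    def __init__(self, name: str, port: int) -> None:
        self.name, self.port = name, port
        self._ready = threading.Event()
        threading.Thread(target=self._serve, daemon=True).start()
        self._ready.wait()  # wait until the listening socket is bound

    def _serve(self) -> None:
        """Server role: answer one incoming message from another peer."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind(("127.0.0.1", self.port))
            srv.listen()
            self._ready.set()
            conn, _addr = srv.accept()
            with conn:
                msg = conn.recv(1024).decode()
                conn.sendall(f"{self.name} received: {msg}".encode())

    def send(self, port: int, message: str) -> str:
        """Client role: send a message to the peer listening on `port`."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
            cli.connect(("127.0.0.1", port))
            cli.sendall(message.encode())
            return cli.recv(1024).decode()


if __name__ == "__main__":
    a, b = Peer("peer-A", 50021), Peer("peer-B", 50022)
    print(a.send(b.port, "block 17, please"))   # A acts as client, B as server
    print(b.send(a.port, "here is block 18"))   # roles reversed
```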
Use and Application
- Distributed computing has become a critical foundational technology for digitizing our personal and professional lives. The internet and the services built on it would not be possible without the client-server architectures of distributed systems.
- Every Google search relies on distributed computing, with server instances around the world cooperating to produce relevant results. Google Maps and Google Earth also use distributed computing to provide their services.
- Email and conferencing systems, airline and hotel reservation systems, libraries, and navigation systems leverage distributed computing methodologies and designs.
- Sensor and monitoring systems in meteorology depend on the processing capacity of distributed networks to forecast natural disasters. Many modern digital applications are built on distributed databases.