Concurrent, Parallel, and Distributed Computing

The book "Concurrent, Parallel, and Distributed Computing" offers an excellent overview of the various areas of the computing field. There is a lot of overlap between the words "concurrent computing," "parallel computing," and "distributed computing," and there is no obvious differentiation between them. The same system can be described as "parallel" and "distributed"; in a typical distributed system, the processors run concurrently in parallel.The content in the book is presented in such a way that even a reader with no prior knowledge of computers may understand it and become acquainted with the fundamental concepts of computing. It offers numerous small examples, demonstration materials, and sample exercises that teachers can use to teach parallel programming principles to students who have just recently been introduced to basic programming concepts. It focuses on Python multiprocessing features like fork/join threading, message passing, sharing resources between threads, and using locks. Parallelism's utility can be seen in applications like searching, sorting, and simulations. Students and researchers can get an accessible and comprehensive explanation of the concepts, guidelines, and, in particular, the complex instrumentation techniques used in computing. However, the concurrency is inherently difficult because it is not very straightforward to identify racing circumstances via experimentation. Even if a test finds a flaw, pinpointing the section of the software that is producing it might be difficult. Concurrency difficulties are notoriously hard to replicate. It is tough to get them to occur in the same method recurrently. The comparative accuracy of predictions that are highly impacted by the atmosphere determines how commands or information are interleaved. Additional running applications, other network activity, system software development selections, differences in computer timepiece speed, and so on may all generate stays. You could obtain different results each time you execute a program with compatibility issues. These are heisenbugs, which are difficult to replicate, as contrasted to “bohrbugs,” which appear every time you look at them. Bohrbugs account for almost all bugs in linear development. Concurrency is difficult to master. However, do not let this deter you. We will examine rational techniques to build concurrent software that is secure from such types of issues during the following many readings.

Author(s): Adele Kuzmiakova
Publisher: Arcler Press
Year: 2023

Language: English
Pages: 260

Cover
Title Page
Copyright
ABOUT THE EDITOR
TABLE OF CONTENTS
List of Figures
List of Abbreviations
Preface
Chapter 1 Fundamentals of Concurrent, Parallel, and Distributed Computing
1.1. Introduction
1.2. Overview of Concurrent Computing
1.3. Overview of Parallel Computing
1.4. Overview of Distributed Computing
References
Chapter 2 Evolution of Concurrent, Parallel, and Distributed Computing
2.1. Introduction
2.2. Evolution of Concurrent Computing
2.3. Evolution of Parallel Computing
2.4. Evolution of Distributed Computing
References
Chapter 3 Concurrent Computing
3.1. Introduction
3.2. Elements of the Approach
3.3. Linearizability in Concurrent Computing
3.4. Composition of Concurrent System
3.5. Progress Property of Concurrent Computing
3.6. Implementation
References
Chapter 4 Parallel Computing
4.1. Introduction
4.2. Multiprocessor Models
4.3. The Impact of Communication
4.4. Parallel Computational Complexity
4.5. Laws and Theorems of Parallel Computation
References
Chapter 5 Distributed Computing
5.1. Introduction
5.2. Association to Computer System Modules
5.3. Motivation
5.4. Association to Parallel Multiprocessor/Multicomputer Systems
5.5. Message-Passing Systems Vs. Shared Memory Systems
5.6. Primitives for Distributed Communication
5.7. Synchronous Vs. Asynchronous Executions
5.8. Design Problems and Challenges
References
Chapter 6 Applications of Concurrent, Parallel, and Distributed Computing
6.1. Introduction
6.2. Applications of Concurrent Computing
6.3. Applications of Parallel Computing
6.4. Applications of Distributed Computing
References
Chapter 7 Recent Developments in Concurrent, Parallel, and Distributed Computing
7.1. Introduction
7.2. Current Developments in Concurrent Computing
7.3. Recent Trends in Parallel Computing
7.4. Recent Trends in Distributed Computing
References
Chapter 8 Future of Concurrent, Parallel, and Distributed Computing
8.1. Introduction
8.2. Future Trends in Concurrent Computing
8.3. Future Trends in Parallel Computing
8.4. Future Trends in Distributed Computing
References
Index
Back Cover