A definitive guide to mastering and implementing concurrency patterns in your applications
Key Features
Build scalable apps with patterns in multithreading, synchronization, and functional programming
Explore parallel programming and multithreading techniques to make your code run faster
Efficiently use the techniques outlined to build reliable applications
Book Description
Selecting the correct concurrency architecture has a significant impact on the design and performance of your applications. This book explains how to leverage the different characteristics of parallel architecture to make your code faster and more efficient.
To start with, you'll understand the basic concurrency concepts and explore patterns around explicit locking, lock-free programming, futures, and actors. Then, you'll get insights into different concurrency models and parallel algorithms and put them into practice in different scenarios to realize your application's true potential. We'll take you through multithreading design patterns such as master-slave, leader-follower, map-reduce, and monitor, with hands-on coding for each pattern. Once you've grasped all of this, you'll move on to solving problems using synchronizer patterns.
You'll discover the rationale for these patterns in distributed and parallel applications, followed by a study of how future composition, immutability, and monadic flow help create more robust code. Toward the end of the book, you'll learn about the actor paradigm and actor patterns: the message-passing concurrency paradigm.
What you will learn
Explore parallel architecture
Get acquainted with concurrency models
Internalize design themes by implementing multithreading patterns
Get insights into concurrent design patterns
Discover the design principles behind many Java threading abstractions
Work with functional concurrency patterns
Who this book is for
This is a must-have guide for developers who want to learn the patterns needed to build scalable and high-performing apps. A decent level of programming knowledge is assumed.
Author(s): Atul S. Khot
Publisher: Packt Publishing
Year: 2018
Language: English
Pages: 264
Cover
Title Page
Copyright and Credits
Packt Upsell
Contributors
Table of Contents
Preface
Chapter 1: Concurrency – An Introduction
Concurrency in a breeze
The push for concurrency
The MapReduce pattern
Fault tolerance
Time sharing
Two models for concurrent programming
The message passing model
Coordination and communication
Flow control
Divide and conquer
The concept of state
The shared memory and shared state model
Threads interleaving – the need for synchronization
Race conditions and heisenbugs
Correct memory visibility and happens-before
Sharing, blocking, and fairness
Asynchronous versus synchronous executions
Java's nonblocking I/O
Of patterns and paradigms
Event-driven architecture
Reactive programming
The actor paradigm
Message brokers
Software transactional memory
Parallel collections
Summary
Chapter 2: A Taste of Some Concurrency Patterns
A thread and its context
Race conditions
The monitor pattern
Thread safety, correctness, and invariants
Sequential consistency
Visibility and final fields
Double-checked locking
Safe publication
The initialization-on-demand holder pattern
Explicit locking
The hand-over-hand pattern
Observations – is it correct?
The producer/consumer pattern
Spurious and lost wake-ups
Comparing and swapping
Summary
Chapter 3: More Threading Patterns
A bounded buffer
Strategy pattern – client polls
Strategy – taking over the polling and sleeping
Strategy – using condition variables
Reader or writer locks
A reader-friendly RW lock
A fair lock
Counting semaphores
Our own reentrant lock
Countdown latch
Implementing the countdown latch
A cyclic barrier
A future task
Summary
Chapter 4: Thread Pools
Thread pools
The command design pattern
Counting words
Another version
The blocking queue
Thread interruption semantics
The fork-join pool
Egrep – simple version
Why use a recursive task?
Task parallelism
Quicksort – using fork-join
The ForkJoinQuicksortTask class
The copy-on-write theme
In-place sorting
The map-reduce theme
Work stealing
Active objects
Hiding and adapting
Using a proxy
Summary
Chapter 5: Increasing the Concurrency
A lock-free stack
Atomic references
The stack implementation
A lock-free FIFO queue
How the flow works
A lock-free queue
Going lock-free
The enque(v) method
The deq() method
Concurrent execution of the enque and deque methods
The ABA problem
Thread locals
Pooling the free nodes
The atomic stamped reference
Concurrent hashing
The add(v) method
The need to resize
The contains(v) method
The big lock approach
The resizing strategy
The lock striping design pattern
Summary
Chapter 6: Functional Concurrency Patterns
Immutability
Unmodifiable wrappers
Persistent data structures
Recursion and immutability
Futures
The apply method
By-name parameters
Future – thread mapping
Futures are asynchronous
Blocking is bad
Functional composition
Summary
Chapter 7: Actor Patterns
Message-driven concurrency
What is an actor?
Let it crash
Location transparency
Actors are featherlight
State encapsulation
Where is the parallelism?
Unhandled messages
The become pattern
Making the state immutable
Let it crash - and recover
Actor communication – the ask pattern
Actors talking with each other
Actor communication – the tell pattern
The pipeTo pattern
Summary
Other Books You May Enjoy
Index