Parallel programming has been revolutionised in .NET 4, which provides, for the first time, a standardised and simplified method for creating robust, scalable, and reliable multi-threaded applications. The parallel programming features of .NET 4 allow the programmer to create applications that harness the power of multi-core and multi-processor machines. Simpler to use and more powerful than “classic” .NET threads, parallel programming allows the developer to remain focused on the work an application needs to perform.

In Pro .NET 4 Parallel Programming in C#, Adam Freeman presents expert advice that guides you through the process of creating concurrent C# applications from the ground up. You’ll be introduced to .NET’s parallel programming features, both old and new, discover the key functionality introduced in .NET 4, and learn how to take advantage of multi-core and multi-processor machines with ease.

Pro .NET 4 Parallel Programming in C# is a reliable companion that will remain with you as you explore the parallel programming universe, elegantly and comprehensively explaining all aspects of parallel programming, guiding you around potential pitfalls, and providing clear-cut solutions to the common problems you will encounter.
Author(s): Adam Freeman
Publisher: Apress
Year: 2010
Language: English
Pages: 328
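The Task Parallel Library at the heart of the book can be glimpsed in a minimal sketch (a hypothetical illustration, not an example from the book): a `Task<int>` started with `Task.Factory.StartNew` computes a result in the background, and reading its `Result` property blocks until the work completes.

```csharp
using System;
using System.Threading.Tasks;

class HelloTask
{
    static void Main()
    {
        // Create and start a Task that computes a result on a background thread.
        // Task.Factory.StartNew is the idiomatic .NET 4 way to start a task.
        Task<int> sumTask = Task.Factory.StartNew(() =>
        {
            int sum = 0;
            for (int i = 1; i <= 100; i++)
            {
                sum += i;
            }
            return sum;
        });

        // Reading Result blocks the calling thread until the task has finished.
        Console.WriteLine("Sum of 1..100 is {0}", sumTask.Result);
    }
}
```

Compared with creating a `Thread` manually, the task expresses only the work to perform; scheduling onto the thread pool is handled by the runtime.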
Preliminary Material
Contents at a Glance
Contents
About the Author
About the Technical Reviewer
Acknowledgments
Introducing Parallel Programming
Introducing .NET Parallel Programming
What’s in This Book (and What Is Not)
Understanding the Benefits (and Pitfalls) of Parallel Programming
Considering Overhead
Coordinating Data
Scaling Applications
Deciding When to Go Parallel
Deciding When to Stay Sequential
Getting Prepared for This Book
Understanding the Structure of This Book
Getting the Example Code
Summary
Task Programming
Hello Task
Creating and Starting Tasks
Creating Simple Tasks
Setting Task State
Getting a Result
Specifying Task Creation Options
Identifying Tasks
Cancelling Tasks
Monitoring Cancellation by Polling
Monitoring Cancellation with a Delegate
Monitoring Cancellation with a Wait Handle
Cancelling Several Tasks
Creating a Composite Cancellation Token
Determining If a Task Was Cancelled
Waiting for Time to Pass
Using a Cancellation Token Wait Handle
Using Classic Sleep
Using Spin Waiting
Waiting for Tasks
Waiting for a Single Task
Waiting for Several Tasks
Waiting for One of Many Tasks
Handling Exceptions in Tasks
Handling Basic Exceptions
Using an Iterative Handler
Reading the Task Properties
Using a Custom Escalation Policy
Getting the Status of a Task
Executing Tasks Lazily
Understanding Common Problems and Their Causes
Task Dependency Deadlock
Solution
Example
Local Variable Evaluation
Solution
Example
Excessive Spinning
Solution
Example
Summary
Sharing Data
The Trouble with Data
Going to the Races
Creating Some Order
Executing Sequentially
Executing Immutably
Executing in Isolation
Synchronizing Execution
Defining Critical Regions
Defining Synchronization Primitives
Using Synchronization Wisely
Don’t Synchronize Too Much
Don’t Synchronize Too Little
Pick the Lightest Tool
Don’t Write Your Own Synchronization Primitives
Using Basic Synchronization Primitives
Locking and Monitoring
Using Interlocked Operations
Using Spin Locking
Using Wait Handles and the Mutex Class
Acquiring Multiple Locks
Configuring Interprocess Synchronization
Using Declarative Synchronization
Using Reader-Writer Locks
Using the ReaderWriterLockSlim Class
Using Recursion and Upgradable Read Locks
Working with Concurrent Collections
Using .NET 4 Concurrent Collection Classes
ConcurrentQueue
ConcurrentStack
ConcurrentBag
ConcurrentDictionary
Using First-Generation Collections
Using Generic Collections
Common Problems and Their Causes
Unexpected Mutability
Solution
Example
Multiple Locks
Solution
Example
Lock Acquisition Order
Solution
Example
Orphaned Locks
Solution
Example
Summary
Coordinating Tasks
Doing More with Tasks
Using Task Continuations
Creating Simple Continuations
Creating One-to-Many Continuations
Creating Selective Continuations
Creating Many-to-One and Any-to-One Continuations
Cancelling Continuations
Waiting for Continuations
Handling Exceptions
Creating Child Tasks
Using Synchronization to Coordinate Tasks
Barrier
CountdownEvent
ManualResetEventSlim
AutoResetEvent
SemaphoreSlim
Using the Parallel Producer/Consumer Pattern
Creating the Pattern
Creating a BlockingCollection Instance
Selecting the Collection Type
Creating the Producers
Creating the Consumer
Combining Multiple Collections
Using a Custom Task Scheduler
Creating a Custom Scheduler
Using a Custom Scheduler
Common Problems and Their Causes
Inconsistent/Unchecked Cancellation
Solution
Example
Assuming Status on Any-to-One Continuations
Solution
Example
Trying to Take Concurrently
Solution
Example
Reusing Objects in Producers
Solution
Example
Using BlockingCollection as IEnumerable
Solution
Example
Deadlocked Task Scheduler
Solution
Example
Summary
Parallel Loops
Parallel vs. Sequential Loops
The Parallel Class
Invoking Actions
Using Parallel Loops
Creating a Basic Parallel For Loop
Creating a Basic Parallel ForEach Loop
Setting Parallel Loop Options
Breaking and Stopping Parallel Loops
Handling Parallel Loop Exceptions
Getting Loop Results
Cancelling Parallel Loops
Using Thread Local Storage in Parallel Loops
Performing Parallel Loops with Dependencies
Selecting a Partitioning Strategy
Using the Chunking Partitioning Strategy
Using the Ordered Default Partitioning Strategy
Creating a Custom Partitioning Strategy
Writing a Contextual Partitioner
Writing an Orderable Contextual Partitioner
Common Problems and Their Causes
Synchronization in Loop Bodies
Solution
Example
Loop Body Data Races
Solution
Example
Using Standard Collections
Solution
Example
Using Changing Data
Solution
Example
Summary
Parallel LINQ
LINQ, But Parallel
Using PLINQ Queries
Using PLINQ Query Features
Ordering Query Results
Using Ordered Subqueries
Performing a No-Result Query
Managing Deferred Query Execution
Controlling Concurrency
Forcing Parallelism
Limiting Parallelism
Forcing Sequential Execution
Handling PLINQ Exceptions
Cancelling PLINQ Queries
Setting Merge Options
Using Custom Partitioning
Using Custom Aggregation
Generating Parallel Ranges
Common Problems and Their Causes
Forgetting the PLINQ Basics
Solution
Creating Race Conditions
Solution
Example
Confusing Ordering
Solution
Example
Sequential Filtering
Solution
Example
Summary
Testing and Debugging
Making Things Better When Everything Goes Wrong
Measuring Parallel Performance
Using Good Coding Strategies
Using Synchronization Sparingly
Using Synchronization Readily
Partitioning Work Evenly
Avoiding Parallelizing Small Work Loads
Measuring Different Degrees of Concurrency
Making Simple Performance Comparisons
Performing Parallel Analysis with Visual Studio
Finding Parallel Bugs
Debugging Program State
Handling Exceptions
Detecting Deadlocks
Summary
Common Parallel Algorithms
Sorting, Searching, and Caching
Using Parallel Quicksort
The Code
Using the Code
Traversing a Parallel Tree
The Code
Using the Code
Searching a Parallel Tree
The Code
Using the Code
Using a Parallel Cache
The Code
Using the Code
Using Parallel Map and Reductions
Using a Parallel Map
The Code
Using the Code
Using a Parallel Reduction
The Code
Using the Code
Using Parallel MapReduce
The Code
Using the Code
Speculative Processing
Selection
The Code
Using the Code
Speculative Caching
The Code
Using the Code
Using Producers and Consumers
Decoupling the Console Class
The Code
Using the Code
Creating a Pipeline
The Code
Using the Code
Index