As CPUs gain more and more cores while the speed of individual cores grows ever more slowly, parallelism is becoming essential if you want to get the most out of the available processing power on a machine.

Several languages already offer different concepts to simplify parallel programming. The .NET Framework and the C# language have also introduced new features to address parallel computing. Two of these are the Task Parallel Library (TPL, formerly known as Parallel Extensions) and Parallel LINQ (also known as PLINQ).

Let’s look at an example using both of these features. For my example I’ll use a class that calculates all primes up to a certain number. Such a class could be used for something like Project Euler. You will notice that the algorithm used to test the primality of a number is the inefficient trial division algorithm. If you are actually working through Project Euler, you are better off implementing something like the Sieve of Eratosthenes.
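For reference, here is a minimal sketch of what such a sieve could look like (the `SievePrimesUpTo` name is my own; this is just an illustration, not the code benchmarked below):

```csharp
using System;
using System.Collections.Generic;

public static class SieveExample
{
    // Returns all primes up to maxValue using the Sieve of Eratosthenes.
    public static IEnumerable<int> SievePrimesUpTo(int maxValue)
    {
        var isComposite = new bool[maxValue + 1];
        var primes = new List<int>();
        for (int candidate = 2; candidate <= maxValue; candidate++)
        {
            if (isComposite[candidate])
                continue;
            primes.Add(candidate);
            // Mark every multiple of this prime as composite, starting at its square
            // (smaller multiples were already marked by smaller primes).
            for (long multiple = (long)candidate * candidate; multiple <= maxValue; multiple += candidate)
                isComposite[multiple] = true;
        }
        return primes;
    }
}
```

The sieve does a single pass over a boolean array instead of re-dividing every candidate, which is why it scales so much better than trial division.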

```csharp
public class PrimeNumberCalculator
{
    public IEnumerable<int> CalculatePrimesUpTo(int maxValue)
    {
        return from number in Enumerable.Range(1, maxValue)
               where IsPrime(number)
               select number;
    }

    // uses inefficient trial division algorithm
    private bool IsPrime(int number)
    {
        if (number == 1)
            return false;

        var squareRoot = Math.Sqrt(number);
        for (int divisor = 2; divisor <= squareRoot; divisor++)
        {
            if (number % divisor == 0)
                return false;
        }
        return true;
    }
}
```

I created a simple benchmark and measured how long it took to find all primes up to 3 000 000. On my computer this took 13.5408 seconds. If you have a multi-core CPU and you run this method, you will notice that only one of the cores is doing all the heavy lifting during this operation.
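The benchmark itself isn't shown above, so here is a sketch of how such a timing can be done with `System.Diagnostics.Stopwatch` (the `PrimeBenchmark` and `TimePrimeCount` names are my own; the prime-testing code mirrors the class above):

```csharp
using System;
using System.Diagnostics;
using System.Linq;

public static class PrimeBenchmark
{
    // Same inefficient trial division algorithm as in the article.
    public static bool IsPrime(int number)
    {
        if (number == 1)
            return false;
        var squareRoot = Math.Sqrt(number);
        for (int divisor = 2; divisor <= squareRoot; divisor++)
            if (number % divisor == 0)
                return false;
        return true;
    }

    // Counts the primes up to maxValue and reports how long the query took.
    public static int TimePrimeCount(int maxValue, out double seconds)
    {
        var stopwatch = Stopwatch.StartNew();
        // ToList() forces the lazy LINQ query to execute inside the timed
        // section; without it the Stopwatch would only measure building the query.
        var primes = (from number in Enumerable.Range(1, maxValue)
                      where IsPrime(number)
                      select number).ToList();
        stopwatch.Stop();
        seconds = stopwatch.Elapsed.TotalSeconds;
        return primes.Count;
    }
}
```

The key detail is materializing the query inside the timed section: LINQ queries are lazily evaluated, so timing the query expression alone would measure essentially nothing.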

Using the parallelism features, we can easily change our program to make it work on multiple cores. Since the public method was using LINQ, we can add an .AsParallel() extension method call to our existing LINQ query and have it run on multiple cores.

```csharp
public IEnumerable<int> ParallelCalculatePrimesUpTo(int maxValue)
{
    return from number in Enumerable.Range(1, maxValue).AsParallel()
           where IsPrime(number)
           select number;
}
```

The change to the existing code is minimal, but the difference in performance is huge. Using the same simple benchmark as before, I get an execution time of 3.9384 seconds on my quad-core system. Looking at my task manager, I can see all four cores being put through their paces.

While these examples use LINQ, we can also get similar results if the original method had used a **for** or **foreach** loop. In those cases, we could replace such a loop with calls to the Parallel class of the TPL.

Here is a version of the CalculatePrimesUpTo method using a normal **foreach** loop instead of LINQ, followed by a version using the Parallel class and its ForEach method.

```csharp
public IEnumerable<int> ForEachCalculatePrimes(int maxValue)
{
    var primeNumbers = new List<int>();
    foreach (int number in Enumerable.Range(1, maxValue))
    {
        if (IsPrime(number))
            primeNumbers.Add(number);
    }
    return primeNumbers;
}

public IEnumerable<int> ParallelForEachCalculatePrimes(int maxValue)
{
    var primeNumbers = new ConcurrentBag<int>();
    Parallel.ForEach(Enumerable.Range(1, maxValue), number =>
    {
        if (IsPrime(number))
            primeNumbers.Add(number);
    });
    return primeNumbers;
}
```

As you can see, you can’t use the normal C# **foreach** keyword to do a parallel **foreach**. Instead, a call is made to the Parallel.ForEach method, which takes an IEnumerable and a delegate to execute for each element.

These methods took 14.2428 seconds and 3.3540 seconds respectively.

And here’s how it would look using a **for** loop and Parallel.For:

```csharp
public IEnumerable<int> ForCalculatePrimes(int maxValue)
{
    var primeNumbers = new List<int>();
    for (int number = 1; number <= maxValue; number++)
    {
        if (IsPrime(number))
            primeNumbers.Add(number);
    }
    return primeNumbers;
}

public IEnumerable<int> ParallelForCalculatePrimes(int maxValue)
{
    var primeNumbers = new ConcurrentBag<int>();
    Parallel.For(1, maxValue + 1, number =>
    {
        if (IsPrime(number))
            primeNumbers.Add(number);
    });
    return primeNumbers;
}
```

The parallel methods are faster, but are the results the same? If you only care about the prime numbers as an unordered set, then yes; but if you do care about the ordering, the answer is no.

For performance reasons, the order of the results isn’t preserved by default. The work is distributed across different threads without preserving the ordering of the non-parallel version.

Here is the output of calculating all the prime numbers up to 20 with CalculatePrimesUpTo and ParallelCalculatePrimesUpTo:

Non-parallel: 2, 3, 5, 7, 11, 13, 17, 19

Parallel: 3, 2, 5, 7, 11, 13, 17, 19

Close but no cigar. With such a small number of returned results, the order might happen to be identical across multiple runs, but you can’t rely on the order of the results.

If the order of the results is important, you can use the .AsOrdered() extension method.

```csharp
return from number in Enumerable.Range(1, maxValue).AsParallel().AsOrdered()
       where IsPrime(number)
       select number;
```

The MSDN documentation states that using AsOrdered risks producing a slower query. In my case, I have run several benchmarks at 3 000 000 method calls and the results have always fallen within less than 0.4 seconds of each other.
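For anyone who wants to repeat the comparison, here is a sketch of how the ordered and unordered PLINQ queries can be timed side by side (the `OrderingBenchmark` class and its method names are my own):

```csharp
using System;
using System.Diagnostics;
using System.Linq;

public static class OrderingBenchmark
{
    // Same inefficient trial division algorithm as in the article.
    public static bool IsPrime(int number)
    {
        if (number == 1)
            return false;
        var squareRoot = Math.Sqrt(number);
        for (int divisor = 2; divisor <= squareRoot; divisor++)
            if (number % divisor == 0)
                return false;
        return true;
    }

    // Counts primes with a PLINQ query, optionally preserving source order.
    public static int CountPrimes(int maxValue, bool ordered)
    {
        var source = Enumerable.Range(1, maxValue).AsParallel();
        if (ordered)
            source = source.AsOrdered();
        return source.Where(IsPrime).Count();
    }

    // Times one run of the query; Count() forces the lazy query to execute.
    public static double Time(int maxValue, bool ordered)
    {
        var stopwatch = Stopwatch.StartNew();
        CountPrimes(maxValue, ordered);
        stopwatch.Stop();
        return stopwatch.Elapsed.TotalSeconds;
    }
}
```

Comparing `Time(3000000, ordered: false)` against `Time(3000000, ordered: true)` over several runs gives the kind of numbers discussed here; both variants return the same count, only the result ordering (and potentially the timing) differs.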

I increased the number of calls in the benchmark to 6 000 000 and then to 10 000 000 and noticed something weird: at those thresholds, the AsOrdered calls were consistently faster than the unordered calls.

I do not have any explanation for this phenomenon. If someone has a hypothesis, feel free to post a comment.

See part 2 for an update on why you shouldn’t use the List collection in the parallel examples.

**EDIT:** Regarding my last question, I posted it on Stack Overflow here: Why is my AsOrdered PLINQ query faster than my unordered one. I will update the blog when I have new information.

That’s a pretty interesting article! I am scared though … there is no chance I keep my advance on Project Euler if you start using C# to solve the problems! 😉

Lol, no need to worry. I sadly haven’t touched Project Euler in a while. I’m working on something else right now that monopolizes my time, but I will make sure to try to overtake you on Project Euler sometime soon (though I know that if you had more time to spend on it, you would breeze through most of the problems)!