Concurrency is not an easy concept to grasp. When doing any task in parallel, the main enemy is shared resources. This is why functional programming has received so much attention lately: by avoiding shared mutable state, it scales much better for parallel computing in an age where CPU cores are abundant.
What about the object-oriented approach? Well, we've been dealing with concurrency problems for quite some time, and in .NET we have the Task Parallel Library (TPL), which makes common concurrency patterns easier to implement. Let's have a look at one of the most common problems in concurrency: race conditions.
A race condition happens when the output depends on the order of events. Race conditions usually occur when multiple operations write to a shared resource, and the side effect is that the shared resource ends up in an unpredictable state.
As a quick example, let's say we want to append some strings.
var sb = new StringBuilder("1");
var t1 = Task.Factory.StartNew(() => { sb.Append("+"); });
var t2 = Task.Factory.StartNew(() => { sb.Append("1"); });
Task.WaitAll(new Task[] { t1, t2 });
Console.WriteLine(sb.ToString());
Now, if you run this, you may get the expected result of 1+1 most of the time, but it's not guaranteed; sometimes t2 runs first and you get 11+ instead. This is a race condition. The problem in this case is the order of execution. We can fix it by controlling when each operation is allowed to execute:
var sb = new StringBuilder("1");
var t1 = Task.Factory.StartNew(() => { sb.Append("+"); });
var t2 = new Task(() => { sb.Append("1"); });
t1.ContinueWith((t) => t2.Start());
t2.Wait();
Console.WriteLine(sb.ToString());
Notice that t2 does not start until t1 has finished. This concept is the same in any concurrent application, regardless of the programming language used.
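On newer versions of .NET the same ordering can be expressed with async/await. This is just an alternative sketch of the idea above, and it assumes an async context such as top-level statements or an async method:
var sb = new StringBuilder("1");
// awaiting each task guarantees the previous append has completed before the next one starts
await Task.Run(() => sb.Append("+"));
await Task.Run(() => sb.Append("1"));
Console.WriteLine(sb.ToString()); // always prints 1+1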
Let’s look at a race condition with a slightly different problem:
var sum = 0;
var t1 = Task.Factory.StartNew(() => sum += 10);
var t2 = Task.Factory.StartNew(() => sum += 20);
var t3 = Task.Factory.StartNew(() => sum += 20);
Task.WaitAll(new Task[] { t1, t2, t3 });
Console.WriteLine(sum);
Notice that we don't care about the order this time around, since the resulting state should be the same regardless of the order of execution. Now, if you run this, chances are you'll get the expected result of 50, but run it enough times and you'll sometimes get a result that isn't 50. The race condition here is in reading and updating sum itself.
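The reason is that sum += 10 is not a single atomic operation. Roughly speaking, it breaks down into a read, a compute, and a write, and another task can interleave between those steps:
var temp = sum;   // 1. read the current value of sum
temp = temp + 10; // 2. compute the new value
sum = temp;       // 3. write it back; any update another task made in between is lost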
To guarantee this program's correctness, we need to ensure that only one thread can change the value of our shared resource sum at any given point in time. In .NET we can do this with the lock statement, which lets only one task at a time execute the block guarded by the same lock object.
var sum = 0;
var obj = new Object(); // a dedicated object used only for locking
var t1 = Task.Factory.StartNew(() => { lock (obj) { sum += 10; }});
var t2 = Task.Factory.StartNew(() => { lock (obj) { sum += 20; }});
var t3 = Task.Factory.StartNew(() => { lock (obj) { sum += 20; }});
Task.WaitAll(new Task[] { t1, t2, t3 });
Console.WriteLine(sum);
You can also solve this using Interlocked.
var sum = 0;
var t1 = Task.Factory.StartNew(() => { Interlocked.Add(ref sum, 10); });
var t2 = Task.Factory.StartNew(() => { Interlocked.Add(ref sum, 20); });
var t3 = Task.Factory.StartNew(() => { Interlocked.Add(ref sum, 20); });
Task.WaitAll(new Task[] { t1, t2, t3 });
Console.WriteLine(sum);
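Interlocked performs the addition as a single atomic operation rather than taking a full lock, so it is usually the cheaper option for simple numeric updates. As a rough sketch (the Parallel.For loop and the iteration count are just for illustration), the same guarantee holds with many more concurrent updates:
var total = 0;
// one thousand parallel iterations, each incrementing total atomically; the result is always 1000
Parallel.For(0, 1000, i => Interlocked.Increment(ref total));
Console.WriteLine(total);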