Concurrent execution means being able to run more than one program at the same time on a single machine, even on a single processor.
These separate streams of execution are typically referred to by a variety of names:
Threads are a topic of discussion in this class, because their properties and functions can be nicely wrapped up into an object and treated as an ADT, as we shall soon see.
Often, two or more threads of execution need to use the same resource. In order to avoid undesirable effects, some means of synchronizing operations between threads needs to be in place.
The following are examples of shared resources:
Let's use the example of a linked list which has two threads working on it, one which adds nodes to the end of the list, and one which reads nodes at the front of the list, processes them, and deletes them from the front of the list. This kind of situation is typically referred to as a "producer/consumer" relationship.
You can probably already see the potential problem in this scenario: What happens when the producer isn't working as quickly as the consumer? It is entirely possible that the consumer would attempt to read and/or delete a node from the list when the list is completely empty! How does one prevent this from occurring? Through synchronization.
So, continuing with our previous example, the producer thread (the one that adds nodes to the list) can freely go about its job without any interruption. It will, however, need to tell the consumer thread (the one reading/deleting nodes from the list) when it can and can't go about its business. The producer thread would do this by testing whether the list is empty and, if so, telling the consumer thread to wait. Likewise, after the producer thread has added a few nodes to the list, it would notify the consumer thread and wake it up. In addition, just before the consumer thread attempts to read/delete a node from the list, it should also test whether the list is empty and, if so, put itself to sleep.
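In Ada, this wait/wake dance can be expressed with a PROTECTED object (covered later in these notes) whose entry has a barrier: the consumer simply blocks on the entry until the producer has put something in. The following is a minimal sketch only — the names Shared_List, Put, and Get are invented, and a simple counter stands in for the real linked list:

```ada
procedure Producer_Consumer_Sketch is

   protected Shared_List is
      procedure Put (Item : in Integer);   -- producer side: add a node
      entry Get (Item : out Integer);      -- consumer side: blocks while empty
   private
      Count : Natural := 0;  -- stands in for the real list storage
   end Shared_List;

   protected body Shared_List is
      procedure Put (Item : in Integer) is
      begin
         Count := Count + 1;  -- adding a node reopens Get's barrier,
                              -- waking any sleeping consumer
      end Put;

      entry Get (Item : out Integer) when Count > 0 is
      begin
         Item := 0;           -- placeholder for "read the front node"
         Count := Count - 1;
      end Get;
   end Shared_List;

begin
   null;
end Producer_Consumer_Sketch;
```

The nice part is that the "test if empty, then sleep" step is not something the consumer has to remember to do — the barrier `when Count > 0` does it automatically.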
Using synchronization techniques is not only useful, it is often vital. If you do not synchronize access to shared data, you risk creating a Race Condition. Simply put, a race condition occurs when two threads access a shared resource at the same time and end up damaging/corrupting it.
Let's say that in our previous example of the producer/consumer threads acting on the linked list, we did not regulate their access to the list, and just let them do whatever they wanted to the list, at any time. Well, it might work okay for a little while, but I guarantee you that sooner or later both of those threads are going to try to modify the list at the same time, one is going to delete the memory right out from under the other, and they will Foul Things Up Beyond All Recognition. Guarantee it.
Bottom line: Use synchronization to avoid race conditions.
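To make the race concrete, here is a hypothetical little Ada program (all names invented) in which two tasks bump the same unprotected counter. Each `Counter := Counter + 1` is really a read, an add, and a write, and those steps from the two tasks can interleave, losing updates:

```ada
procedure Race_Sketch is
   Counter : Integer := 0;  -- shared and unprotected, on purpose

   task type Bumper;
   task body Bumper is
   begin
      for I in 1 .. 100_000 loop
         Counter := Counter + 1;  -- read, add, write: not atomic
      end loop;
   end Bumper;

   A, B : Bumper;  -- two tasks racing on Counter
begin
   null;  -- Race_Sketch won't finish until A and B do; when it does,
          -- Counter may well be less than the 200_000 you'd expect
end Race_Sketch;
```

Wrapping the increment in a PROTECTED procedure (as described below for Ada) would make the result come out right every time.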
There is one other noteworthy point that should be introduced: Deadlocks. Simply put, a deadlock is created when two or more threads are put to sleep and never woken up. This is typically a result of poor handling of shared resources.
Continuing from our ongoing example above, let's say the consumer thread told the producer thread to wait while it read and deleted nodes--just to avoid any possible collisions from having both threads manipulating the data structure at once. With the producer thread put to sleep, it is no longer adding nodes to the linked list. Once the consumer thread gets done reading all the nodes in the list, it will likewise put itself to sleep and wait to be awoken by the producer thread when it has added some more nodes. The problem is, the producer thread is asleep and no longer producing nodes! Bottom line: Both threads are asleep, nothing is getting done, and the program is locked up.
To avoid this unfortunate circumstance, you must be careful about when you tell your threads to go to sleep, and be sure they get woken up again. Double-check your code and your logic, and step through the code in a debugger (or pepper the code with print statements if necessary). Another approach is to give your threads a timeout value, after which they will wake up on their own. There really is no silver bullet here; you just have to be careful.
If you are interested in finding more information on how concurrent execution / threading works, I would recommend the following:
To begin, concurrent programming (implemented as "Tasks" in Ada) is one of the nicest things about the language. The implementation is very clean and straightforward. If you're still cringing from the polymorphism stuff in the previous chapter, you can relax now and enjoy this stuff.
Ada Threads - Tasks: Tasks are introduced on page 726. Be sure to read the bulleted list on that same page which tells what a Task is and isn't and how it is similar to constructs you have already learned. The example program which follows is likewise instructive. Note that tasks are started implicitly if no entry point is specified (more on that below). On page 728 we learn that you can declare two instances of the same type of Task.
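The "two instances of the same task type" idea might look like this minimal sketch (Worker, First, and Second are names I made up, not the book's):

```ada
with Ada.Text_IO;

procedure Two_Workers is
   task type Worker;  -- a task *type*, so we can declare several of them

   task body Worker is
   begin
      Ada.Text_IO.Put_Line ("working...");
   end Worker;

   First, Second : Worker;  -- both start running at the BEGIN below,
                            -- no explicit "start" call needed
begin
   null;  -- the main program waits here until both tasks finish
end Two_Workers;
```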
Timeslicing - The DELAY keyword: See the section entitled "Cooperating Tasks" that begins in the middle of page 729 and goes on through page 730. This is like a sleep() statement in C and simply allows the program to time-slice task execution a little more nicely.
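A hedged sketch of the idea (names invented): each `delay` suspends the task that executes it, handing the processor to whoever else is ready to run.

```ada
with Ada.Text_IO;

procedure Tick_Tock is
   task Ticker;

   task body Ticker is
   begin
      for I in 1 .. 3 loop
         Ada.Text_IO.Put_Line ("tick");
         delay 0.5;  -- suspend Ticker for ~half a second,
                     -- letting the main program get a turn
      end loop;
   end Ticker;

begin
   for I in 1 .. 3 loop
      Ada.Text_IO.Put_Line ("tock");
      delay 0.5;  -- likewise give Ticker a turn
   end loop;
end Tick_Tock;
```

Without the delays, one side could hog the processor and you'd see the output bunch up instead of alternating nicely.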
Entry Points - The ACCEPT keyword: Look at the section that begins at the bottom of page 730 entitled "Controlling the Starting Order of Tasks", and goes on through the top of page 734. Here you see how you can specify a kind of "method" (an entry) that keeps the task from doing its real work immediately; the task waits at its ACCEPT statement until another task calls that entry. The example that follows is illuminating.
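Here is a minimal sketch of the entry/ACCEPT rendezvous (Late_Starter and Start are invented names): the task blocks at `accept Start` until the main program calls `Late_Starter.Start`, so the main program controls when the task really gets going.

```ada
with Ada.Text_IO;

procedure Start_Order is
   task Late_Starter is
      entry Start;  -- the task won't proceed until this is called
   end Late_Starter;

   task body Late_Starter is
   begin
      accept Start;  -- rendezvous point: wait here for the call
      Ada.Text_IO.Put_Line ("released!");
   end Late_Starter;

begin
   Ada.Text_IO.Put_Line ("main goes first");
   Late_Starter.Start;  -- this call releases the waiting task
end Start_Order;
```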
The book also mentions a SELECT keyword. Don't worry about it.
The "gray-box" syntax for all this is on page 733.
The way tasks are synchronized/blocked in Ada is via the PROTECTED keyword. (I realize this will trouble a lot of you C++ fans out there--Take it like a man.) The section that describes it begins on page 734 and goes on through to page 739. It uses an example of drawing to the screen to demonstrate the need for synchronization. We will likely go over this in class. Again, note how clean and straightforward this implementation of synchronization is. I guess anybody can have a good day.
What you should take away from this reading is that a PROTECTED type is a "mini-package" full of methods whose entry will be restricted to a single thread. This is an example of the 'Synchronization Blocks' approach described above in the theory section.
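As a hedged sketch of that "mini-package" idea, using the book's drawing-to-the-screen motivation (Screen, Write, and Talker are my invented names): only one task at a time may be inside any of the protected object's methods, so output lines can't get interleaved.

```ada
with Ada.Text_IO;

procedure Screen_Demo is

   protected Screen is
      procedure Write (S : in String);  -- one task inside at a time
   end Screen;

   protected body Screen is
      procedure Write (S : in String) is
      begin
         Ada.Text_IO.Put_Line (S);  -- can't be interrupted mid-line
      end Write;
   end Screen;

   task type Talker;
   task body Talker is
   begin
      Screen.Write ("hello from a task");
   end Talker;

   A, B : Talker;  -- both funnel their output through Screen
begin
   null;
end Screen_Demo;
```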
Write a program which contains two tasks that will act on a single floating-point number.
You could even use the same task declaration for both tasks and pass it the appropriate value for the DELAY statement as a parameter. In fact, I encourage you to code it this way.
You might want to make a separate routine to print the value to the screen and put it inside a PROTECTED type. It's up to you.
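The parameterized-DELAY trick suggested above can be done with a task discriminant. This is only a starting-point sketch, not a full solution (all names are invented, and the "act on the number" part is a placeholder); note that a discriminant must be a discrete type, so you can't pass a Duration directly -- pass an integer and convert:

```ada
procedure Assignment_Skeleton is
   Value : Float := 0.0;  -- the shared floating-point number
                          -- (a real solution should guard it with a
                          -- PROTECTED type, per the warnings above)

   task type Actor (Tenths : Natural);  -- delay time, in tenths of a second

   task body Actor is
   begin
      for Step in 1 .. 10 loop
         Value := Value + 1.0;           -- placeholder operation
         delay Duration (Tenths) / 10;   -- per-instance DELAY amount
      end loop;
   end Actor;

   Fast : Actor (Tenths => 2);  -- delays 0.2 s per step
   Slow : Actor (Tenths => 5);  -- delays 0.5 s per step
begin
   null;
end Assignment_Skeleton;
```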
Update: I have learned that Tasks in the GNAT compiler on the Zonker server are broken. Your programs will compile, but if you try to run them, they will segfault. Since this is the case, please simply complete the program, get it to compile, and leave it for me to grade. I will only be looking at the source anyway, so it doesn't matter if your program won't run. I apologize for this inconvenience.
This (last) assignment is due on December 10th, '98. This is the same night as the Final Exam! Note also that if you are planning on turning in an Extra Credit report, it is also due on December 10th.