Microsoft.NET

Expertise in .NET Technologies

Multithreading

Posted by Ravi Varma Thumati on October 27, 2009

Multithreading is a powerful tool for creating high performance applications, especially those that require user interaction.

Definition of Threads:

In order to understand multithreading, we first have to understand threads. Each application or program running on a system is a process. The process for each program consists of one or more threads that are given processor time to do the work. Each thread contains all of the context information the system needs in order to execute the program's instructions.

The operating system is responsible for scheduling and executing threads. Remember the days of Windows 3.x? A single process thread could, and usually did, monopolize all of the processor time. The system would just sit and wait for that thread to complete before executing any other processes.

Newer operating systems, such as Windows 2000, support pre-emptive multitasking, which allocates each thread a time slice. When the time slice of the currently executing thread has elapsed, the operating system suspends the thread, saves its context, loads the context of another thread, and resumes that thread according to its previous state. This gives the appearance that multiple threads are executing at the same time and helps prevent the system from becoming unresponsive because of a single thread (end task, anyone?). On systems that have more than one processor, threads are distributed across all of the processors, so there really are multiple threads executing at the same time.

Thread States: Life Cycle of a Thread

At any time, a thread is said to be in one of several thread states. This section discusses these states and the transitions between states. Two classes critical for multithreaded applications are Thread and Monitor (System.Threading namespace). This section also discusses several methods of classes Thread and Monitor that cause state transitions.

A new thread begins its life cycle in the Unstarted state. The thread remains in the Unstarted state until the program calls Thread method Start, which places the thread in the Started state (sometimes called the Ready or Runnable state) and immediately returns control to the calling thread. Then the thread that invoked Start, the newly Started thread, and any other threads in the program execute concurrently.

Thread life cycle.

The highest-priority Started thread enters the Running state (i.e., begins executing) when the operating system assigns a processor to the thread (thread priorities are discussed below). When a Started thread receives a processor for the first time and becomes a Running thread, the thread executes its ThreadStart delegate, which specifies the actions the thread will perform during its life cycle. When a program creates a new Thread, the program specifies the Thread's ThreadStart delegate as the argument to the Thread constructor. The ThreadStart delegate must be a method that returns void and takes no arguments.
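As a minimal sketch (not the article's own sample), creating a thread from a ThreadStart delegate and starting it looks like this:

```csharp
using System;
using System.Threading;

class ThreadStartDemo
{
    // The ThreadStart target: a method that returns void and takes no arguments.
    static void PrintMessages()
    {
        for (int i = 0; i < 3; i++)
            Console.WriteLine("Worker thread: message {0}", i);
    }

    static void Main()
    {
        // The delegate passed to the constructor defines what the thread
        // executes once it leaves the Started state and begins Running.
        Thread worker = new Thread(new ThreadStart(PrintMessages));
        worker.Start();   // Unstarted -> Started; control returns immediately
        worker.Join();    // wait until the worker reaches the Stopped state
    }
}
```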

A Running thread enters the Stopped (or Dead) state when its ThreadStart delegate terminates. Note that a program can force a thread into the Stopped state by calling Thread method Abort on the appropriate Thread object. Method Abort throws a ThreadAbortException in the thread, normally causing the thread to terminate. When a thread is in the Stopped state and there are no references to the thread object, the garbage collector can remove the thread object from memory.


A thread enters the Blocked state when the thread issues an input/output request. The operating system blocks the thread from executing until the operating system can complete the I/O for which the thread is waiting. At that point, the thread returns to the Started state, so it can resume execution. A Blocked thread cannot use a processor even if one is available.

There are three ways in which a Running thread enters the WaitSleepJoin state. If a thread encounters code that it cannot execute yet (normally because a condition is not satisfied), the thread can call Monitor method Wait to enter the WaitSleepJoin state. Once in this state, a thread returns to the Started state when another thread invokes Monitor method Pulse or PulseAll. Method Pulse moves the next waiting thread back to the Started state. Method PulseAll moves all waiting threads back to the Started state.
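A minimal sketch of this Wait/Pulse handshake (the `padlock` object and `ready` flag are illustrative names, not from the original text):

```csharp
using System;
using System.Threading;

class WaitPulseDemo
{
    static readonly object padlock = new object();
    static bool ready = false;

    static void Consumer()
    {
        lock (padlock)                 // Wait must be called while holding the lock
        {
            while (!ready)             // re-check the condition after each wakeup
                Monitor.Wait(padlock); // releases the lock, enters WaitSleepJoin
            Console.WriteLine("Condition satisfied; consumer resumes.");
        }
    }

    static void Main()
    {
        Thread consumer = new Thread(Consumer);
        consumer.Start();
        Thread.Sleep(100);             // give the consumer time to reach Wait
        lock (padlock)
        {
            ready = true;
            Monitor.Pulse(padlock);    // moves one waiting thread back to Started
        }
        consumer.Join();
    }
}
```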

A Running thread can call Thread method Sleep to enter the WaitSleepJoin state for a period of milliseconds specified as the argument to Sleep. A sleeping thread returns to the Started state when its designated sleep time expires. Sleeping threads cannot use a processor, even if one is available.

Any thread that enters the WaitSleepJoin state by calling Monitor method Wait or by calling Thread method Sleep also leaves the WaitSleepJoin state and returns to the Started state if the sleeping or waiting Thread's Interrupt method is called by another thread in the program.
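For example, a thread sleeping in the WaitSleepJoin state can be woken early by Interrupt, which the sleeping thread observes as a ThreadInterruptedException (a small sketch):

```csharp
using System;
using System.Threading;

class SleepInterruptDemo
{
    static void Sleeper()
    {
        try
        {
            Thread.Sleep(Timeout.Infinite); // enter WaitSleepJoin indefinitely
        }
        catch (ThreadInterruptedException)
        {
            // Interrupt from another thread wakes the sleeper with this exception.
            Console.WriteLine("Interrupted out of WaitSleepJoin.");
        }
    }

    static void Main()
    {
        Thread sleeper = new Thread(Sleeper);
        sleeper.Start();
        Thread.Sleep(100);      // give the sleeper time to fall asleep
        sleeper.Interrupt();    // WaitSleepJoin -> Started
        sleeper.Join();
    }
}
```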

If a thread cannot continue executing (we will call this the dependent thread) unless another thread terminates, the dependent thread calls the other thread's Join method to "join" the two threads. When two threads are "joined," the dependent thread leaves the WaitSleepJoin state when the other thread finishes execution (enters the Stopped state). If a Running Thread's Suspend method is called, the Running thread enters the Suspended state. A Suspended thread returns to the Started state when another thread in the program invokes the Suspended thread's Resume method.
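A sketch of a dependent thread joining a worker:

```csharp
using System;
using System.Threading;

class JoinDemo
{
    static void Worker()
    {
        Thread.Sleep(200);                       // simulate some work
        Console.WriteLine("Worker finished.");
    }

    static void Main()
    {
        Thread worker = new Thread(Worker);
        worker.Start();

        // Main is the dependent thread: it enters the WaitSleepJoin state here
        // and only continues once the worker enters the Stopped state.
        worker.Join();
        Console.WriteLine("Dependent thread resumes after the join.");
    }
}
```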

Thread Priorities and Thread Scheduling

Every thread has a priority in the range from ThreadPriority.Lowest to ThreadPriority.Highest. These two values come from the ThreadPriority enumeration (namespace System.Threading). The enumeration consists of the values Lowest, BelowNormal, Normal, AboveNormal and Highest. By default, each thread has priority Normal.

The Windows operating system supports a concept, called timeslicing, that enables threads of equal priority to share a processor. Without timeslicing, each thread in a set of equal-priority threads runs to completion (unless the thread leaves the Running state and enters the WaitSleepJoin, Suspended or Blocked state) before the thread’s peers get a chance to execute. With timeslicing, each thread receives a brief burst of processor time, called a quantum, during which the thread can execute. At the completion of the quantum, even if the thread has not finished executing, the processor is taken away from that thread and given to the next thread of equal priority, if one is available.

The job of the thread scheduler is to keep the highest-priority thread running at all times and, if there is more than one highest-priority thread, to ensure that all such threads execute for a quantum in round-robin fashion (i.e., these threads can be timesliced). The figure below illustrates the multilevel priority queue for threads. Assuming a single-processor computer, threads A and B each execute for a quantum in round-robin fashion until both threads complete execution. This means that A gets a quantum of time to run. Then B gets a quantum. Then A gets another quantum. Then B gets another quantum. This continues until one thread completes. The processor then devotes all its power to the thread that remains (unless another thread of that priority is Started). Next, thread C runs to completion. Threads D, E and F each execute for a quantum in round-robin fashion until they all complete execution. This process continues until all threads run to completion. Note that, depending on the operating system, new higher-priority threads could postpone, possibly indefinitely, the execution of lower-priority threads. Such indefinite postponement often is referred to more colorfully as starvation.

Thread-priority scheduling.


A thread's priority can be adjusted with the Priority property, which accepts values from the ThreadPriority enumeration. If the assigned value is not one of the valid thread-priority constants, an ArgumentException occurs. A thread executes until it dies, becomes Blocked for input/output (or some other reason), calls Sleep, calls Monitor method Wait or Join, is preempted by a thread of higher priority or has its quantum expire. A thread with a higher priority than the Running thread can become Started (and hence preempt the Running thread) if a sleeping thread wakes up, if I/O completes for a thread that was Blocked for that I/O, if either Pulse or PulseAll is called on an object on which Wait was called, or if a thread to which the high-priority thread was Joined completes.
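For example (a small sketch), priorities are assigned through the Priority property; the exact interleaving of output is up to the scheduler:

```csharp
using System;
using System.Threading;

class PriorityDemo
{
    static void Count(string name)
    {
        for (int i = 0; i < 3; i++)
            Console.WriteLine("{0} tick {1}", name, i);
    }

    static void Main()
    {
        Thread low  = new Thread(() => Count("low"));
        Thread high = new Thread(() => Count("high"));

        low.Priority  = ThreadPriority.BelowNormal;  // valid ThreadPriority values
        high.Priority = ThreadPriority.AboveNormal;

        // Assigning a value outside the enumeration, e.g.
        //   low.Priority = (ThreadPriority)42;
        // throws an ArgumentException.

        low.Start();
        high.Start();
        low.Join();
        high.Join();
    }
}
```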

Threading Model Background:

There are many threading models available that are common to Microsoft Win32-based environments.

Single Threaded

Single threaded means there is only one thread within the process and it is doing all of the work for the process. The process must wait for the current execution of the thread to complete before it can perform another action.

Single threading results in system idle time and user frustration. For example, assume we are saving a file to a remote network location using a single threaded application. Since there is only a single thread in the application, the application will not be able to do anything else while the file is being stored in the remote location. Thus the user waits and begins to wonder if the application is ever going to resume.

Apartment Threading (Single Threaded Apartment)

Apartment threaded means there are multiple threads within the application. In a single threaded apartment (STA), each thread is isolated in a separate apartment underneath the process. The process can have any number of apartments that share data through a proxy. The application defines when and for how long the thread in each apartment should execute. All requests are serialized through the Windows message queue, so only a single apartment is accessed at a time and thus only a single thread executes at any one time. STA is the threading model most Visual Basic developers are familiar with, because it is the threading model available to VB applications prior to VB.NET. You can think of it like an apartment building with a row of one-room apartments that are accessible one at a time through a single hallway. The advantage this provides over single threading is that multiple commands can be issued at one time instead of just a single command, but the commands are still executed sequentially.

Free Threading (Multi Threaded Apartment)

Free threaded applications were limited to programming languages such as C++ until the release of Microsoft .NET. The free threaded/Multi Threaded Apartment (MTA) model has a single apartment created underneath the process rather than multiple apartments. This single apartment holds multiple threads rather than just a single thread. No message queue is required, because all of the threads are part of the same apartment and can share data without a proxy. You can think of it like a building with multiple rooms that are all accessible once you are inside the building. These applications typically execute faster than single threaded and STA applications because there is less system overhead, and they can be optimized to eliminate system idle time.

These types of applications are more complex to program. The developer must provide thread synchronization as part of the code to ensure that threads do not simultaneously access the same resources. A race condition occurs when one thread modifies a shared resource into an invalid state, and another thread then accesses the shared resource and uses it in that invalid state before the first thread can return the resource to a valid state. It is therefore necessary to place a lock on a resource to prevent other threads from accessing it until the lock has been removed. However, locking can lead to a deadlock, where two threads compete for resources and neither can proceed. For example, thread #1 has a resource locked and is waiting for another resource that is currently locked by thread #2, while thread #2 is waiting for the resource locked by thread #1. Both threads are waiting on the other, and neither will be allowed to proceed. The only way to avoid situations like these is through good design and testing.
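A minimal sketch of using C#'s lock statement to prevent the race condition described above (the shared counter is illustrative):

```csharp
using System;
using System.Threading;

class RaceDemo
{
    static int counter = 0;
    static readonly object counterLock = new object();

    static void Increment()
    {
        for (int i = 0; i < 100000; i++)
        {
            // Without the lock, the read-modify-write below can interleave
            // between threads and lose updates (a race condition).
            lock (counterLock)
            {
                counter++;
            }
        }
    }

    static void Main()
    {
        Thread t1 = new Thread(Increment);
        Thread t2 = new Thread(Increment);
        t1.Start(); t2.Start();
        t1.Join();  t2.Join();
        // With the lock the total is deterministic; without it, it usually is not.
        Console.WriteLine(counter);
    }
}
```

To avoid the deadlock scenario described above, one common discipline is to have every thread acquire multiple locks in the same fixed order.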

Using Multiple Threads

There are advantages and disadvantages to the use of multiple threads. Using multiple threads is not always advantageous, so you should make the determination for yourself whether or not there is enough benefit for your application(s).

Advantages of Multiple Threads

  • Improved responsiveness — when it comes to application performance, I am a firm believer that perception is reality. If the user perceives that my application is slow, then in reality it is slow. If the application performs operations that take a perceivably long time to complete, those operations can be put into a separate thread, which allows the application to remain responsive to the user.
  • Faster application — multiple threads can lead to improved application performance. For example, if there are a number of calculations to be performed or the contents of a file are being processed, the application can be made faster by performing multiple operations at the same time.
  • Prioritization — Threads can be assigned a priority, which would allow higher priority tasks to take precedence over lower priority tasks.

Disadvantages of Multiple Threads

  • Programming and debugging is more complex — With multithreaded applications the programmer must account for race conditions and deadlocks.
  • Threads add overhead to the system — in order for the operating system to track a large number of threads, it is going to consume processor time. If there are too many threads, each thread may not be given enough time to execute during its time slice, and each thread is scheduled for execution less frequently because of the number of threads and the time slice committed to each.

Threading Example

To demonstrate the use of multiple threads, we will use a sample Windows application that shows how a user interface can be made to respond differently when performing the same action.

Sample User Interface

The application has a user interface containing a list box. When either the "Run Non-threaded" or "Run Multithreaded" button is pressed, the buttons are temporarily disabled and the list box is filled with "Hello World" messages, with a processing delay between each entry. The interface also contains a button to test the interface's responsiveness while the list box is being populated. The sample user interface is as follows:

[Screenshot: sample user interface]

Testing the Sample

Run the sample application and press the "Run Non-threaded" button. While the process is running, try pressing the "Test Responsiveness" button. You'll notice that the user interface does not update the list as items are added, nor does the response counter update when you test the responsiveness. Once processing is complete, the list box is populated and the test responsiveness actions are then executed.

Now press the “Run Multithreaded” button. While the process is running you’ll immediately notice that the list box is updating as the process executes. As you press the “Test Responsiveness” button you will notice the button is responsive and the counter updates as you click on the button.
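The sample's source is not reproduced here, but the multithreaded handler can be sketched roughly as follows; the control names (lstMessages, btnRunMultithreaded) and the loop details are hypothetical, not from the original sample:

```csharp
using System;
using System.Threading;
using System.Windows.Forms;

// Sketch only: a button starts the slow work on a separate thread so the
// UI thread stays free to repaint and handle clicks.
public class MainForm : Form
{
    readonly ListBox lstMessages = new ListBox { Dock = DockStyle.Top, Height = 150 };
    readonly Button btnRunMultithreaded = new Button { Text = "Run Multithreaded", Dock = DockStyle.Bottom };

    public MainForm()
    {
        Controls.Add(lstMessages);
        Controls.Add(btnRunMultithreaded);
        btnRunMultithreaded.Click += (s, e) =>
        {
            btnRunMultithreaded.Enabled = false;
            new Thread(PopulateList).Start();   // do the slow work off the UI thread
        };
    }

    void PopulateList()
    {
        for (int i = 0; i < 10; i++)
        {
            Thread.Sleep(500);                  // simulated processing delay
            // Controls may only be touched from the UI thread, so marshal back:
            Invoke((MethodInvoker)(() => lstMessages.Items.Add("Hello World " + i)));
        }
        Invoke((MethodInvoker)(() => btnRunMultithreaded.Enabled = true));
    }

    [STAThread]
    static void Main() { Application.Run(new MainForm()); }
}
```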

Timer Threads:

A common use of threads is for all kinds of periodic updates. Under Win32 we had two different ways: window timers and time-limited waiting for events. .NET offers three different ways:

  • Windows timers with the System.Windows.Forms.Timer class
  • Periodic delegate calling with the System.Threading.Timer class
  • Exact timing with the System.Timers.Timer class

For inexact timing we use the window timer. Events raised from the window timer go through the message pump (together with all mouse events and UI update messages), so they are never exact. The simplest way to create a System.Windows.Forms.Timer is by dropping a Timer control onto a form and creating an event handler through the control's properties. We use the Interval property to set the number of milliseconds between timer ticks, the Start method to start ticking, and Stop to stop ticking. Note that a stopped timer is simply disabled: it raises no further Tick events until it is started again.

The System.Threading.Timer class uses a waiting thread in the thread pool to periodically call a supplied delegate.
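For example (a small sketch), a System.Threading.Timer calls its delegate on a thread-pool thread after a due time and then at a fixed period:

```csharp
using System;
using System.Threading;

class ThreadingTimerDemo
{
    static void Main()
    {
        // The callback runs on a thread-pool thread: first after 500 ms
        // (the due time), then every 1000 ms (the period).
        using (Timer timer = new Timer(
            state => Console.WriteLine("tick, state = {0}", state),
            "shared state",        // the state object handed to every callback
            500, 1000))
        {
            Thread.Sleep(3500);    // let a few ticks fire
        }                          // Dispose stops the timer
    }
}
```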

For the due time and period you can use TimeSpan values if that makes more sense. In the supplied sample you can see that the same state object is used on both timers. When you run the sample, the state values printed are not consecutive. I left it like this to show how important synchronization is in a multithreaded environment.

The final timing option comes from the System.Timers.Timer class. It is a server-based timer designed for maximum accuracy; its Elapsed events are raised on a thread-pool thread, outside the UI message loop, so it can be used for watchdog-style control.

The System.Timers.Timer class is the most complete solution for all timer-fired events. It gives you the most precise control and timing and is surprisingly simple to use.
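A minimal System.Timers.Timer sketch:

```csharp
using System;
using System.Timers;

class TimersTimerDemo
{
    static void Main()
    {
        var timer = new Timer(1000);             // interval in milliseconds
        timer.Elapsed += (sender, e) =>          // raised on a thread-pool thread
            Console.WriteLine("Elapsed at {0:HH:mm:ss.fff}", e.SignalTime);
        timer.AutoReset = true;                  // keep firing every interval
        timer.Start();

        System.Threading.Thread.Sleep(3500);     // let a few events fire
        timer.Stop();                            // a stopped timer can be restarted
    }
}
```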

Thread Pooling:

The idea of providing a pool of threads at the .NET Framework level comes from the fact that most threads in multithreaded programs spend most of their time waiting for something to happen; thread entry functions typically contain endless loops that call the real working functions. By using the ThreadPool object, preparing working functions is simpler, and as a bonus we get better resource usage.

There are two important facts about the ThreadPool object:

  • There is only one ThreadPool object per process
  • Only one pool thread per processor is actually working at any one time

The most useful way to use a ThreadPool object is to add a new work item with a triggering event to the thread pool, i.e., "when this event happens, do this."

A less common use of a thread pool is (or at least should be; beware of misuse) adding work items to be executed when the processor is free. You can think of this kind of usage as "OK, I have this to do, so do it whenever you have time." It is a very democratic way of handling background processing, but misuse can quickly bog down your program. Remember that inside the thread pool you have only one thread actually working per processor.
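Both uses can be sketched with the ThreadPool API: RegisterWaitForSingleObject for "when this event happens, do this," and QueueUserWorkItem for "do it whenever you have time" (a small sketch):

```csharp
using System;
using System.Threading;

class ThreadPoolDemo
{
    static void Main()
    {
        var trigger  = new AutoResetEvent(false);
        var finished = new ManualResetEvent(false);

        // "When this event happens, do this": the pool waits on the handle
        // and invokes the callback when it is signaled.
        ThreadPool.RegisterWaitForSingleObject(
            trigger,
            (state, timedOut) =>
            {
                Console.WriteLine("Triggered: {0}", state);
                finished.Set();
            },
            "event-driven work",   // state object passed to the callback
            Timeout.Infinite,      // no timeout on the wait
            true);                 // execute the callback only once

        // "Do it whenever you have time": plain queued background work.
        ThreadPool.QueueUserWorkItem(_ => trigger.Set());

        finished.WaitOne();        // keep Main alive until the callback runs
    }
}
```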

Some notes on the thread pool:

  • Don't use the thread pool for threads that perform long calculations; you only have one thread per processor actually working.
  • System.Threading.Timer callbacks execute on a thread inside the thread pool
  • You have only one thread pool per process