What Is Meant by Multithreading in C#? Concepts & Components

Multithreading in C#

Multithreading in C# is a programming technique that allows multiple parts of a program to execute at the same time, and C# provides language-level support for it. Multithreading is used primarily to make better use of the CPU and to keep applications responsive, particularly applications with user interfaces, by running long-running work concurrently with the main thread.

Core Concepts and Components

C# and the .NET Framework offer several essential elements that make multithreaded programming easier:

Thread: A thread is the basic unit of execution. Every C# program starts with a single main thread by default.

System.Threading Namespace: This namespace is central to multithreading in .NET and contains important classes such as Thread, ThreadStart, and Mutex.

ThreadStart Delegate: This predefined delegate encapsulates the method that a new thread will execute; an instance of it is passed to the Thread constructor when the thread is created.

Task Parallel Library (TPL): The TPL is the set of public types and APIs in the System.Threading.Tasks namespace. It is intended to increase developer productivity by simplifying the process of adding concurrency and parallelism to programs. Parallel.For, Parallel.ForEach, and Parallel.Invoke are examples of its methods.
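
As a brief sketch of the TPL methods just mentioned, the following snippet uses Parallel.For and Parallel.Invoke to run work in parallel; the loop body and the two actions are placeholders for illustration:

using System;
using System.Threading.Tasks;

class TplSketch
{
    static void Main()
    {
        // Parallel.For runs the loop body for indexes 0..4 across available threads.
        Parallel.For(0, 5, i =>
        {
            Console.WriteLine($"Parallel.For iteration {i} on thread {Environment.CurrentManagedThreadId}");
        });

        // Parallel.Invoke runs several independent actions concurrently and waits for all of them.
        Parallel.Invoke(
            () => Console.WriteLine("First action"),
            () => Console.WriteLine("Second action"));
    }
}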

Async/Await Keywords: These keywords, introduced in C# 5.0, simplify asynchronous programming, especially for I/O-bound operations such as file access and web requests. By freeing the calling thread to perform other work while an operation is in flight, async/await can improve responsiveness. However, it is important to remember that asynchronous does not always mean concurrent, parallel, or multi-threaded.
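
A minimal sketch of async/await for an I/O-bound operation follows; the URL is only a placeholder, and HttpClient is used here simply because it exposes awaitable methods:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class AsyncSketch
{
    // The async method returns a Task, and await yields control while the download is in flight.
    static async Task Main()
    {
        using var client = new HttpClient();

        // GetStringAsync does not block the calling thread; the thread is free to do other work
        // until the response arrives, at which point execution resumes after the await.
        string html = await client.GetStringAsync("https://example.com"); // placeholder URL

        Console.WriteLine($"Downloaded {html.Length} characters.");
    }
}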

BackgroundWorker Component: This component lets you run time-consuming operations on a background thread in Windows Forms applications without blocking the UI thread. It supports both progress reporting and cancellation.
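
The sketch below shows the BackgroundWorker API in a plain console program so that it stays self-contained; in a Windows Forms application the ProgressChanged and RunWorkerCompleted events would be raised on the UI thread instead, so the manual signalling used here would not be needed:

using System;
using System.ComponentModel;
using System.Threading;

class BackgroundWorkerSketch
{
    static void Main()
    {
        using var done = new ManualResetEventSlim(false);
        var worker = new BackgroundWorker { WorkerReportsProgress = true };

        worker.DoWork += (sender, e) =>
        {
            for (int i = 1; i <= 5; i++)
            {
                Thread.Sleep(200);                              // simulate a slow step
                ((BackgroundWorker)sender).ReportProgress(i * 20);
            }
            e.Result = "finished";
        };

        worker.ProgressChanged += (sender, e) =>
            Console.WriteLine($"Progress: {e.ProgressPercentage}%");

        worker.RunWorkerCompleted += (sender, e) =>
        {
            Console.WriteLine($"Worker result: {e.Result}");
            done.Set();                                         // signal Main that the worker is done
        };

        worker.RunWorkerAsync();                                // run DoWork on a thread-pool thread
        done.Wait();                                            // keep the console app alive until completion
    }
}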

Volatile Keyword: Marking a field as volatile tells the compiler that its value may be modified by several threads. It prevents compiler optimizations that assume single-threaded access, ensuring that every read of the field observes the most recently written value.
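
A common illustration of volatile is a flag that one thread sets to tell another thread to stop; without volatile, the reading loop could be optimized to cache the field and never observe the change. This is a minimal sketch:

using System;
using System.Threading;

class VolatileSketch
{
    // volatile tells the compiler/JIT that this field may be written by another thread,
    // so every read must fetch the current value rather than a cached one.
    static volatile bool stopRequested = false;

    static void Main()
    {
        Thread worker = new Thread(() =>
        {
            while (!stopRequested)
            {
                // busy work; the loop re-reads stopRequested on every iteration
            }
            Console.WriteLine("Worker observed the stop request and is exiting.");
        });

        worker.Start();
        Thread.Sleep(500);        // let the worker spin for a moment
        stopRequested = true;     // the main thread requests the stop
        worker.Join();
    }
}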

Thread Management and Life Cycle

There are several stages in a thread’s life cycle, and the Thread class offers methods to control them:

Creation: A thread is usually created by defining the method the thread will run, creating a ThreadStart delegate that points to that method, instantiating a Thread object with the ThreadStart instance, and then calling Start() on the Thread object.

Start(): Starts the thread’s execution process.

Sleep(int milliseconds): This static method suspends the current thread's execution for the specified amount of time.

Join(): Blocks the calling thread until the thread on which Join() was called has finished.

Suspend() and Resume(): Pause and resume a thread's execution. Note that these methods are marked obsolete because suspending a thread at an arbitrary point can easily cause deadlocks.

Abort(): Requests termination of a thread. This method is also obsolete and is not supported on .NET Core and later versions.
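
A compact sketch of this life cycle using only the non-obsolete methods (create, Start, Sleep, Join) follows; the fuller example later in this article applies the same steps:

using System;
using System.Threading;

class LifeCycleSketch
{
    static void Worker()
    {
        Console.WriteLine("Worker thread running");
        Thread.Sleep(1000);                          // pause the worker for one second
        Console.WriteLine("Worker thread finished");
    }

    static void Main()
    {
        ThreadStart start = new ThreadStart(Worker); // delegate pointing at the method to run
        Thread thread = new Thread(start);           // create the thread (Unstarted state)
        thread.Start();                              // move it to the Running state
        thread.Join();                               // block Main until the worker completes
        Console.WriteLine("Main thread done");
    }
}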

Thread Synchronization

When several threads try to use the same shared resources or methods at the same time, race conditions can make the results unexpected or incorrect. Thread synchronization avoids these problems by guaranteeing that only one thread at a time can access a shared resource or a critical section of code.

Typical synchronization mechanisms include:

lock Keyword: The lock keyword ensures that only one thread within the same process can execute a block of code at a time, providing thread safety. It is the most straightforward way to avoid race conditions in critical sections. Standard practice is to lock on a dedicated private object, often declared static readonly.

Mutex Class: The Mutex class provides thread synchronization via its WaitOne() and ReleaseMutex() methods. Only one thread at a time can execute the code enclosed between these calls, which forms the synchronized section.
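
A minimal sketch of the WaitOne()/ReleaseMutex() pattern is shown below; the counter and the number of threads are illustrative only:

using System;
using System.Threading;

class MutexSketch
{
    static readonly Mutex mutex = new Mutex(); // unnamed mutex: synchronizes threads within this process
    static int counter = 0;

    static void Increment()
    {
        mutex.WaitOne();          // block until the mutex is acquired
        try
        {
            counter++;            // only one thread at a time executes this section
        }
        finally
        {
            mutex.ReleaseMutex(); // always release, even if the protected code throws
        }
    }

    static void Main()
    {
        Thread t1 = new Thread(Increment);
        Thread t2 = new Thread(Increment);
        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();
        Console.WriteLine($"Counter = {counter}"); // always 2
    }
}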

Monitor Class: The Monitor class provides the synchronization methods used to protect critical sections; in fact, the lock keyword is compiled into calls to Monitor.Enter and Monitor.Exit wrapped in a try/finally block.
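
The sketch below uses Monitor directly; it is functionally what a lock statement expands to, and Monitor.TryEnter additionally allows a timeout, which lock does not offer:

using System;
using System.Threading;

class MonitorSketch
{
    static readonly object gate = new object();
    static int counter = 0;

    static void Increment()
    {
        bool lockTaken = false;
        try
        {
            // Try for up to one second to enter the critical section.
            Monitor.TryEnter(gate, TimeSpan.FromSeconds(1), ref lockTaken);
            if (lockTaken)
            {
                counter++;        // protected critical section
            }
            else
            {
                Console.WriteLine("Could not acquire the lock within the timeout.");
            }
        }
        finally
        {
            if (lockTaken)
            {
                Monitor.Exit(gate); // release the lock if it was acquired
            }
        }
    }

    static void Main()
    {
        Thread t1 = new Thread(Increment);
        Thread t2 = new Thread(Increment);
        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();
        Console.WriteLine($"Counter = {counter}");
    }
}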

Code Example: Multithreading and Synchronization

This example demonstrates C# multithreading by running two methods concurrently. It then shows the problem that arises without synchronization and how thread safety can be achieved with the lock keyword:

using System;
using System.Threading;

namespace MultithreadingApplication
{
    class Program
    {
        // Shared resource that multiple threads will try to access and modify
        static int sharedCount = 0;
        static readonly object lockObject = new object(); // Object to lock on for synchronization

        // Method 1: Will print numbers from 1 to 10
        public static void Method1()
        {
            Console.WriteLine("Method1 starts");
            for (int i = 1; i <= 10; i++)
            {
                Console.WriteLine("Method1: " + i);
                Thread.Sleep(100); // Simulate some work
            }
            Console.WriteLine("Method1 ends");
        }

        // Method 2: Will print numbers from 11 to 20
        public static void Method2()
        {
            Console.WriteLine("Method2 starts");
            for (int k = 11; k <= 20; k++)
            {
                Console.WriteLine("Method2: " + k);
                Thread.Sleep(100); // Simulate some work
            }
            Console.WriteLine("Method2 ends");
        }

        // Method to demonstrate a potential race condition without lock
        public static void DangerousIncrement()
        {
            // This is dangerous if multiple threads call it simultaneously
            // because read-modify-write is not atomic.
            int temp = sharedCount;
            Console.WriteLine($"Thread {Thread.CurrentThread.ManagedThreadId}: Reading sharedCount = {temp}");
            Thread.Sleep(50); // Small delay to increase chance of context switch
            sharedCount = temp + 1;
            Console.WriteLine($"Thread {Thread.CurrentThread.ManagedThreadId}: Incrementing sharedCount to {sharedCount}");
        }

        // Method to demonstrate thread-safe increment using lock
        public static void SafeIncrement()
        {
            lock (lockObject) // Ensure only one thread executes this block at a time
            {
                int temp = sharedCount;
                Console.WriteLine($"Thread {Thread.CurrentThread.ManagedThreadId}: Entering lock, reading sharedCount = {temp}");
                Thread.Sleep(50); // Simulate some work
                sharedCount = temp + 1;
                Console.WriteLine($"Thread {Thread.CurrentThread.ManagedThreadId}: Exiting lock, incrementing sharedCount to {sharedCount}");
            }
        }


        static void Main(string[] args)
        {
            Console.WriteLine("--- Demonstrating Concurrent Execution ---");
            // Step 1: Create ThreadStart delegates for the methods
            ThreadStart ts1 = new ThreadStart(Method1); 
            ThreadStart ts2 = new ThreadStart(Method2); 

            // Step 2: Create Thread objects
            Thread t1 = new Thread(ts1); 
            Thread t2 = new Thread(ts2); 

            // Step 3: Start the threads
            t1.Start(); 
            t2.Start(); 

            // Wait for threads to complete to ensure Main doesn't exit prematurely
            t1.Join(); // Blocks the calling thread (Main) until t1 completes 
            t2.Join(); // Blocks the calling thread (Main) until t2 completes 
            Console.WriteLine("--- Concurrent Execution Completed ---");

            Console.WriteLine("\n--- Demonstrating Race Condition (Dangerous) ---");
            sharedCount = 0; // Reset shared count
            Thread d_t1 = new Thread(DangerousIncrement);
            Thread d_t2 = new Thread(DangerousIncrement);
            d_t1.Start();
            d_t2.Start();
            d_t1.Join();
            d_t2.Join();
            Console.WriteLine($"Final sharedCount (Dangerous): {sharedCount}"); // Might be 1 instead of 2

            Console.WriteLine("\n--- Demonstrating Thread Safety with lock ---");
            sharedCount = 0; // Reset shared count
            Thread s_t1 = new Thread(SafeIncrement);
            Thread s_t2 = new Thread(SafeIncrement);
            s_t1.Start();
            s_t2.Start();
            s_t1.Join();
            s_t2.Join();
            Console.WriteLine($"Final sharedCount (Safe): {sharedCount}"); // Should be 2

            Console.ReadKey();
        }
    }
}

Output

--- Demonstrating Concurrent Execution ---
Method1 starts
Method2 starts
Method1: 1
Method2: 11
Method1: 2
Method2: 12
Method1: 3
Method2: 13
Method1: 4
Method2: 14
Method1: 5
Method2: 15
Method2: 16
Method1: 6
Method2: 17
Method1: 7
Method2: 18
Method1: 8
Method2: 19
Method1: 9
Method2: 20
Method1: 10
Method2 ends
Method1 ends
--- Concurrent Execution Completed ---

--- Demonstrating Race Condition (Dangerous) ---
Thread 5: Reading sharedCount = 0
Thread 6: Reading sharedCount = 0
Thread 6: Incrementing sharedCount to 1
Thread 5: Incrementing sharedCount to 1
Final sharedCount (Dangerous): 1

--- Demonstrating Thread Safety with lock ---
Thread 7: Entering lock, reading sharedCount = 0
Thread 7: Exiting lock, incrementing sharedCount to 1
Thread 8: Entering lock, reading sharedCount = 1
Thread 8: Exiting lock, incrementing sharedCount to 2
Final sharedCount (Safe): 2

Explanation of the example:

Concurrent Execution: The Main method creates two Thread objects, t1 and t2, and assigns a separate method (Method1 and Method2) to each. Once t1.Start() and t2.Start() are called, both threads run and their output interleaves, showing that they execute concurrently. The Join() calls make the Main thread wait for t1 and t2 to finish before continuing.

Race Condition (Dangerous): The DangerousIncrement method attempts to increment the sharedCount variable. Because d_t1 and d_t2 execute it at roughly the same time without any synchronization, both threads may read the same initial value of sharedCount, increment it, and write back their result, leaving sharedCount at 1 rather than 2 after both threads have run. This happens because the read-modify-write sequence (temp = sharedCount; sharedCount = temp + 1;) is not atomic.

Thread Safety with lock: The SafeIncrement method wraps the same logic in a lock (lockObject) statement, which guarantees that only one thread at a time can enter the protected block. If s_t1 is inside the lock, s_t2 waits until s_t1 leaves it. Because the read, modify, and write operations on sharedCount are performed as a unit, the final value is guaranteed to be 2.

