What Are CPU Scheduling And Synchronization?



  Category:  OPERATING SYSTEM | 30th June 2025, Monday

techk.org, kaustub technologies

1. CPU Scheduling:

Definition:

CPU scheduling is the process by which the operating system decides which process in the ready queue is allocated the CPU next. It plays a critical role in achieving efficient process execution, responsiveness, and optimal system utilization.

Need For CPU Scheduling:

  • Multiple processes may be in the ready state simultaneously, but only one can execute on the CPU at any given time (on a single-core system).
  • Efficient scheduling maximizes CPU utilization, reduces waiting time, and improves throughput.

Types Of Schedulers:

1. Long-term scheduler – selects processes from the job pool and loads them into memory.

2. Short-term scheduler (CPU scheduler) – selects one process from the ready queue for execution.

3. Medium-term scheduler – handles swapping of processes between main memory and disk.

CPU Scheduling Algorithms:

| Algorithm                       | Characteristics                  | Advantage                        | Disadvantage                    |
| ------------------------------- | -------------------------------- | -------------------------------- | ------------------------------- |
| FCFS (First-Come, First-Served) | Non-preemptive                   | Simple to implement              | Poor average waiting time       |
| SJF (Shortest Job First)        | Non-preemptive or preemptive     | Optimal average waiting time     | Difficult to predict job length |
| Priority Scheduling             | Based on priority level          | Flexible                         | Starvation problem              |
| Round Robin                     | Preemptive, time-quantum based   | Fair allocation                  | Context-switching overhead      |
| Multilevel Queue                | Multiple queues by priority/type | Good for different process types | Rigid, difficult tuning         |
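As an illustrative sketch (not from the article), FCFS average waiting time can be computed by accumulating the bursts that run before each process; the burst values in the note below are assumed for illustration:

```c
/* FCFS: processes run in arrival order, so process i waits for the
   sum of all earlier burst times. Burst values are illustrative. */
double fcfs_avg_wait(const int burst[], int n) {
    int elapsed = 0, total_wait = 0;
    for (int i = 0; i < n; i++) {
        total_wait += elapsed;  /* time this process spent waiting */
        elapsed += burst[i];    /* later processes also wait for this burst */
    }
    return (double)total_wait / n;
}
```

For bursts {24, 3, 3} the waits are 0, 24, and 27, an average of 17 — showing the "poor average waiting time" from the table when a long job arrives first.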

Performance Metrics:

  • CPU Utilization
  • Throughput
  • Turnaround Time
  • Waiting Time
  • Response Time
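These metrics relate by simple identities: turnaround time = completion − arrival, and waiting time = turnaround − burst. A small helper sketch (the struct and field names are illustrative, not from the article):

```c
/* Per-process timing record; names are illustrative. */
typedef struct {
    int arrival;      /* when the process entered the ready queue */
    int burst;        /* total CPU time it needs */
    int completion;   /* when it finished executing */
} proc_times;

/* Turnaround: total time from arrival to completion. */
int turnaround_time(proc_times p) { return p.completion - p.arrival; }

/* Waiting: turnaround minus the time actually spent on the CPU. */
int waiting_time(proc_times p)    { return turnaround_time(p) - p.burst; }
```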

2. Synchronization:

Definition:

Synchronization is the process of coordinating the execution of processes so that shared resources (such as memory, files, and I/O devices) are accessed safely in a concurrent environment.

Need For Synchronization:

  • To avoid race conditions
  • To ensure data consistency
  • To coordinate process execution order

Critical Section Problem:

A critical section is a segment of code in which a process accesses shared resources. The challenge is to design a protocol that ensures mutual exclusion, so that no two processes are in their critical sections at the same time.

Requirements Of Critical Section Problem:

1. Mutual Exclusion

2. Progress

3. Bounded Waiting

Synchronization Mechanisms:

Software Solutions:

  • Peterson’s Algorithm
  • Bakery Algorithm
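Peterson's algorithm for two processes can be sketched with C11 atomics; the sequentially consistent accesses that `_Atomic` operations default to stand in for the naive loads and stores the classic presentation assumes:

```c
#include <stdatomic.h>
#include <stdbool.h>

/* Peterson's algorithm for two processes with ids 0 and 1.
   flag[i] means "process i wants to enter"; turn is a tie-breaker. */
static _Atomic bool flag[2];
static _Atomic int  turn;

void peterson_lock(int id) {
    int other = 1 - id;
    atomic_store(&flag[id], true);   /* announce intent to enter */
    atomic_store(&turn, other);      /* politely let the other go first */
    /* Spin only while the other wants in AND it is its turn —
       this yields mutual exclusion, progress, and bounded waiting. */
    while (atomic_load(&flag[other]) && atomic_load(&turn) == other)
        ;                            /* busy-wait */
}

void peterson_unlock(int id) {
    atomic_store(&flag[id], false);  /* leave the critical section */
}
```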

Hardware Solutions:

  • Test-and-Set Instruction
  • Compare-and-Swap Instruction
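The test-and-set idea maps directly onto C11's `atomic_flag`: `atomic_flag_test_and_set` atomically sets the flag and returns its previous value, which is exactly the hardware primitive named above. A minimal spinlock sketch:

```c
#include <stdatomic.h>

/* Test-and-set spinlock built on the C11 atomic_flag primitive. */
static atomic_flag lock_flag = ATOMIC_FLAG_INIT;

void spin_lock(void) {
    /* If the previous value was already 'set', someone else holds
       the lock; keep retrying until we observe 'clear'. */
    while (atomic_flag_test_and_set(&lock_flag))
        ;   /* busy-wait */
}

void spin_unlock(void) {
    atomic_flag_clear(&lock_flag);   /* release the lock */
}
```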

OS-level Solutions (High-level APIs):

  • Semaphores
  • Mutex Locks
  • Monitors
  • Condition Variables
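As a concrete OS-level example, a POSIX mutex (one of the high-level APIs listed above) can protect a shared counter from concurrent increments; the thread count and iteration count here are arbitrary choices for the sketch:

```c
#include <pthread.h>

/* A pthread mutex guarding a shared counter. */
static pthread_mutex_t counter_lock = PTHREAD_MUTEX_INITIALIZER;
static long counter = 0;

static void *increment(void *arg) {
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&counter_lock);    /* entry section */
        counter++;                            /* critical section */
        pthread_mutex_unlock(&counter_lock);  /* exit section */
    }
    return NULL;
}

/* Run two increment threads; with the mutex the result is exact. */
long run_two_threads(void) {
    pthread_t t1, t2;
    counter = 0;
    pthread_create(&t1, NULL, increment, NULL);
    pthread_create(&t2, NULL, increment, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    return counter;   /* 200000: no updates are lost */
}
```

Without the lock and unlock calls, the two `counter++` updates can interleave and lose increments — the race condition synchronization exists to prevent.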

Example – Semaphore (Pseudocode):

```c
semaphore mutex = 1;   // binary semaphore, initially available

Process P1:
  wait(mutex);         // entry section: acquire
  // critical section
  signal(mutex);       // exit section: release

Process P2:
  wait(mutex);
  // critical section
  signal(mutex);
```

Common Synchronization Problems:

  • Producer-Consumer Problem
  • Dining Philosophers Problem
  • Readers-Writers Problem
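The producer–consumer problem is classically solved with two counting semaphores plus a mutex; below is a bounded-buffer sketch using POSIX semaphores (the buffer size and the identifier names are assumptions for illustration):

```c
#include <pthread.h>
#include <semaphore.h>

#define BUF_SIZE 8

/* 'empty_slots' counts free slots, 'full_slots' counts filled slots,
   and 'buf_mutex' protects the buffer indices themselves. */
static int buffer[BUF_SIZE];
static int in = 0, out = 0;
static sem_t empty_slots, full_slots;
static pthread_mutex_t buf_mutex = PTHREAD_MUTEX_INITIALIZER;

void pc_init(void) {
    sem_init(&empty_slots, 0, BUF_SIZE);  /* all slots free */
    sem_init(&full_slots, 0, 0);          /* nothing produced yet */
}

void produce(int item) {
    sem_wait(&empty_slots);               /* block if buffer is full */
    pthread_mutex_lock(&buf_mutex);
    buffer[in] = item;
    in = (in + 1) % BUF_SIZE;
    pthread_mutex_unlock(&buf_mutex);
    sem_post(&full_slots);                /* one more item available */
}

int consume(void) {
    sem_wait(&full_slots);                /* block if buffer is empty */
    pthread_mutex_lock(&buf_mutex);
    int item = buffer[out];
    out = (out + 1) % BUF_SIZE;
    pthread_mutex_unlock(&buf_mutex);
    sem_post(&empty_slots);               /* one more free slot */
    return item;
}
```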

Conclusion:

CPU scheduling and synchronization are fundamental mechanisms in modern operating systems. Scheduling ensures efficient CPU use and responsiveness, while synchronization guarantees safe concurrent access to shared resources, preventing conflicts and maintaining system stability.

Tags:
CPU Scheduling and Synchronization, Definition of CPU Scheduling and Synchronization
