Session 10

Concurrency

January 6th, 2018

In this session we will talk about concurrency. So, what is concurrency as a programming language concept?

Concurrency refers to the ability of different parts or units of a program, algorithm, or problem to be executed out of order or in partial order, without affecting the final outcome.

Concurrency can occur at four levels:

–Machine instruction level

–High-level language statement level

–Unit level

–Program level

 

Introduction to Subprogram-Level Concurrency

A task (also called a process or thread) is a program unit that can be in concurrent execution with other program units

Tasks differ from ordinary subprograms in that:

–A task may be implicitly started

–When a program unit starts the execution of a task, it is not necessarily suspended

–When a task’s execution is completed, control may not return to the caller

Tasks usually work together
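
To make the difference concrete, here is a minimal sketch in Go (the session did not include code; the example is mine): the go statement starts a task, the caller keeps running instead of being suspended, and when the task completes, control does not return to the caller; the caller only waits because it explicitly chooses to.

    package main

    import (
        "fmt"
        "sync"
        "time"
    )

    func main() {
        var wg sync.WaitGroup
        wg.Add(1)

        // Starting a task: unlike an ordinary subprogram call,
        // the caller is NOT suspended here.
        go func() {
            defer wg.Done()
            time.Sleep(100 * time.Millisecond)
            fmt.Println("task: finished its work")
            // When the task completes, control does not return
            // to the caller; the goroutine simply ends.
        }()

        fmt.Println("caller: still running while the task executes")

        // The caller only waits because it explicitly chooses to.
        wg.Wait()
        fmt.Println("caller: done")
    }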

Two General Categories of Tasks

  • Heavyweight tasks execute in their own address space
  • Lightweight tasks all run in the same address space – more efficient
  • A task is disjoint if it does not communicate with or affect the execution of any other task in the program in any way
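
To see the address-space point, a small Go sketch (again mine, not from the session): goroutines are lightweight tasks that all share the program's single address space, so they can read the same package-level variable directly, whereas a heavyweight task would be a separate operating-system process with its own memory.

    package main

    import (
        "fmt"
        "sync"
    )

    // A package-level variable lives in the single address space
    // that all goroutines (lightweight tasks) in this program share.
    var greeting = "hello from the shared address space"

    func main() {
        var wg sync.WaitGroup
        for i := 1; i <= 3; i++ {
            wg.Add(1)
            go func(id int) {
                defer wg.Done()
                // Every lightweight task can read the same variable directly.
                // A heavyweight task (a separate process) could not; it would
                // need explicit inter-process communication instead.
                fmt.Printf("task %d sees: %q\n", id, greeting)
            }(i)
        }
        wg.Wait()
    }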

Task Synchronization

A mechanism that controls the order in which tasks execute

Two kinds of synchronization

  • Cooperation: Task A must wait for task B to complete some specific activity before task A can continue its execution, e.g., the producer-consumer problem
  • Competition: Two or more tasks must use some resource that cannot be simultaneously used, e.g., a shared counter

–Competition synchronization is usually provided by mutually exclusive access (approaches are discussed later)
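
As a sketch of competition synchronization (not part of the session material), here is the classic shared-counter case in Go: without the mutex, increments from different tasks can interleave and be lost; the mutex provides the mutually exclusive access mentioned above.

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        var (
            counter int        // the shared resource the tasks compete for
            mu      sync.Mutex // provides mutually exclusive access
            wg      sync.WaitGroup
        )

        for i := 0; i < 4; i++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                for j := 0; j < 1000; j++ {
                    mu.Lock() // only one task may hold the lock at a time
                    counter++ // without the lock, increments could be lost
                    mu.Unlock()
                }
            }()
        }

        wg.Wait()
        fmt.Println("final counter:", counter) // always 4000 with the mutex
    }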

 

Scheduler

  • Providing synchronization requires a mechanism for delaying task execution
  • Task execution control is maintained by a program called the scheduler, which maps task execution onto available processors

Task Execution States

  • New – created but not yet started
  • Ready – ready to run but not currently running (no available processor)
  • Running
  • Blocked – has been running, but cannot now continue (usually waiting for some event to occur)
  • Dead – no longer active in any sense

Liveness and Deadlock

  • Liveness is a characteristic that a program unit may or may not have
    – In sequential code, it means the unit will eventually complete its execution
  • If all tasks in a concurrent environment lose their liveness, it is called deadlock
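
Here is a small Go sketch of how tasks lose their liveness (deliberately broken, and not from the session): two tasks acquire two locks in opposite orders, each ends up waiting forever for the lock the other one holds, and the Go runtime aborts with its "all goroutines are asleep" deadlock report. The usual fix is to make every task acquire the locks in the same order.

    package main

    import (
        "sync"
        "time"
    )

    // This sketch is *meant* to deadlock: each task holds one lock and waits
    // forever for the lock the other task holds, so both lose their liveness.
    func main() {
        var mu1, mu2 sync.Mutex
        var wg sync.WaitGroup
        wg.Add(2)

        go func() { // task A: takes mu1, then wants mu2
            defer wg.Done()
            mu1.Lock()
            time.Sleep(100 * time.Millisecond) // give task B time to take mu2
            mu2.Lock()                         // blocks forever
            mu2.Unlock()
            mu1.Unlock()
        }()

        go func() { // task B: takes mu2, then wants mu1
            defer wg.Done()
            mu2.Lock()
            time.Sleep(100 * time.Millisecond) // give task A time to take mu1
            mu1.Lock()                         // blocks forever
            mu1.Unlock()
            mu2.Unlock()
        }()

        wg.Wait() // never returns: every task is blocked
    }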

Design Issues for Concurrency

  • Competition and cooperation synchronization*
  • Controlling task scheduling
  • How can an application influence task scheduling
  • How and when tasks start and end execution
  • How and when are tasks created

* The most important issue

Methods of Providing Synchronization

  • Semaphores
    A semaphore is a data structure consisting of a counter and a queue for storing task descriptors
  • Monitor
    A monitor is an abstract data type for shared data
  • Message Passing
    Message passing is a general model for concurrency
    – It can model both semaphores and monitors
    – It is not just for competition synchronization
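
None of the Go below is from the session; it is a sketch of the semaphore idea using a buffered channel, where the channel's capacity stands in for the counter and the goroutines blocked on a full channel stand in for the queue of waiting task descriptors. (Go has no monitor construct as such; the usual substitute is a struct whose shared data is only touched while holding a sync.Mutex. Message passing is sketched in the next section.)

    package main

    import (
        "fmt"
        "sync"
        "time"
    )

    // A counting semaphore sketched with a buffered channel: the channel's
    // capacity plays the role of the counter, and goroutines blocked on a
    // send play the role of the queue of waiting task descriptors.
    type semaphore chan struct{}

    func (s semaphore) acquire() { s <- struct{}{} } // wait (P)
    func (s semaphore) release() { <-s }             // signal (V)

    func main() {
        sem := make(semaphore, 2) // at most 2 tasks use the resource at once
        var wg sync.WaitGroup

        for i := 1; i <= 5; i++ {
            wg.Add(1)
            go func(id int) {
                defer wg.Done()
                sem.acquire()
                fmt.Printf("task %d acquired the semaphore\n", id)
                time.Sleep(50 * time.Millisecond) // pretend to use the resource
                sem.release()
            }(i)
        }
        wg.Wait()
    }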

Multiple Entry Points

–A task (in Ada's message-passing model) can have more than one entry point

–The task specification has an entry clause for each entry point

–The task body has an accept clause for each entry clause, placed in a select clause, which is inside a loop
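
Go is not Ada, but a rough analogue of the same structure can be sketched: each channel below plays the role of an entry, and the select inside the loop plays the role of the select statement with one accept clause per entry. The account/deposit/withdraw names are made up for the illustration.

    package main

    import (
        "fmt"
        "sync"
    )

    // A rough analogue of a task with two entry points: each channel plays the
    // role of an entry clause, and the select inside the loop plays the role
    // of the select statement with one accept clause per entry.
    func account(deposit, withdraw <-chan int, done <-chan struct{}) {
        balance := 0
        for {
            select {
            case amount := <-deposit: // like "accept Deposit(Amount)"
                balance += amount
                fmt.Println("balance after deposit:", balance)
            case amount := <-withdraw: // like "accept Withdraw(Amount)"
                balance -= amount
                fmt.Println("balance after withdrawal:", balance)
            case <-done:
                return
            }
        }
    }

    func main() {
        deposit := make(chan int)
        withdraw := make(chan int)
        done := make(chan struct{})

        var wg sync.WaitGroup
        wg.Add(1)
        go func() {
            defer wg.Done()
            account(deposit, withdraw, done)
        }()

        deposit <- 100 // each unbuffered send is a small rendezvous with the task
        withdraw <- 30
        close(done) // tell the task to stop
        wg.Wait()   // and wait for it to finish
    }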
